• Posted on October 30, 2017 10:29 am
    Joseph Forbes
    No comments

    Most broadband Internet connections stay "always on," keeping you online at all times. Home network owners often leave their routers, broadband modems and other gear powered up and operating constantly, even when not using them for long periods, for the sake of convenience. But is it really a good idea to keep home network equipment always connected? Consider the pros and cons.

    Advantages of Powering Down Home Networks

    Security: Powering off your gear when not using it improves your network security. When network devices are offline, hackers and Wi-Fi wardrivers cannot target them. Other security measures like firewalls help and are necessary, but they are not bulletproof.

    Savings on utility bills: Powering down computers, routers and modems saves money. In some countries the savings are small, but in other parts of the world utility costs are significant.

    Surge protection: Unplugging network devices prevents them from being damaged by electric power surges. Surge protectors can also prevent this kind of damage; however, surge units (particularly inexpensive ones) cannot always protect against major power spikes like those from lightning strikes.

    Noise reduction: Networking gear is much quieter than it was years ago, before loud built-in fans were replaced with solid-state cooling, but your senses may simply have adjusted to the relatively low level of home network noise. You might be pleasantly surprised at the added tranquility of a residence without it.

    Disadvantages of Powering Down Home Networks

    Hardware reliability: Frequently power cycling a computer or other networked device can shorten its working life due to the extra stress involved. Disk drives are particularly susceptible to damage. On the other hand, high temperature also greatly reduces the lifetime of network equipment, so leaving equipment always on may well cause more damage from heat than powering it down occasionally would.

    Communication reliability: After power cycling, network connections may sometimes fail to reestablish, so take care to follow the proper start-up procedure. For example, broadband modems generally should be powered on first, and other devices only later, after the modem is ready.

    Convenience: Network devices like routers and modems may be installed on ceilings, in basements or in other hard-to-reach places. You should shut down these devices gracefully, using the manufacturer-recommended procedure, rather than merely "pulling the plug." Powering down a network takes time to do properly and may seem an inconvenience at first.

    The Bottom Line

    Home network gear need not be powered on and connected to the Internet at all times. All things considered, turning off your network during extended periods of non-use is a good idea; the security benefit alone makes it worthwhile. Because computer networks can be difficult to set up initially, some people naturally fear disrupting one once it's working. In the long run, though, this practice will increase your confidence and peace of mind as a home network administrator.
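    The modem-first start-up order can be sketched as a small script. Everything here is illustrative: the device names, the wait times, and the power_on callback are assumptions (stand-ins for a smart-plug API or simply pressing buttons by hand), not a real home-automation interface.

```python
import time

# Power-up order matters: the modem must sync with the ISP before the
# router (and anything behind it) tries to obtain an address. The delays
# below are rough guesses; watching the modem's sync light is more reliable.

STARTUP_ORDER = [
    ("modem", 60),    # wait ~60 s for the modem to sync with the ISP
    ("router", 30),   # the router needs the modem's WAN link first
    ("switch", 10),
    ("devices", 0),   # PCs, printers, etc. can come up last
]

def bring_up_network(power_on, wait=time.sleep):
    """Power on each device in order, pausing between steps."""
    started = []
    for device, delay in STARTUP_ORDER:
        power_on(device)   # placeholder: smart plug, or a walk to the closet
        started.append(device)
        wait(delay)
    return started
```

    Passing in the power_on and wait callables keeps the ordering logic testable without real hardware.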

    Blog Entry, Hacking, Hardware
  • Posted on October 28, 2017 10:15 am
    Joseph Forbes
    No comments

    Leave your computer on all the time, or shut it off when it's not in use: does it really make a difference? If you've been asking yourself this question, you'll be happy to hear that you can choose whichever way you want. You just need to understand the ramifications of your choice and take a few precautions to ensure you get the longest life you can from your computer. The most important precaution is to add a UPS (Uninterruptible Power Supply), no matter which method you choose. A UPS can protect your computer from many of the dangers it's likely to face.

    The Things That Can Harm Your Computer

    All of the parts that make up your computer have a limited lifetime. The processor, RAM, and graphics card all age due to, among other things, heat and temperature cycling. Additional failure modes come from the stress of cycling a computer on and off. It's not just your computer's semiconductors that are affected, either. Mechanical components, such as those in hard drives, optical drives, printers, and scanners, are all affected by the power cycling they may undergo when your computer is turned off or on. In many cases, peripherals such as printers and external drives have circuitry that senses when your computer is powered on or off and initiates the same condition, turning the device on or off as needed. There are other failure modes that originate externally to your computer. The one most often mentioned is a power surge or power drop, where there's a sudden rise or fall in voltage on the electrical circuit your computer is plugged into. We often associate these surges with transient events, such as nearby lightning strikes, or devices that draw a lot of power at once (vacuum cleaners, hair dryers, etc.). All of these failure types need to be considered.
    Leaving a computer turned on can reduce exposure to some of these failure types, while turning your computer off can prevent most of the external events that can damage a computer's components. The question then becomes: which is best, on or off? It turns out, at least in my opinion, that it's a bit of both. If your goal is to maximize lifetime, there's a period when turning a new computer on and off makes sense; later, leaving it on 24/7 makes sense.

    Computer Life Testing and Failure Rates

    There are various failure modes that can result in your computer, well, failing. Computer manufacturers have a few tricks up their sleeves to reduce the failure rate seen by end users. What makes this interesting is that the assumptions manufacturers make about warranty periods can be upset by the decision to leave a computer on 24/7; let's find out why. Computer and component manufacturers use various tests to ensure the quality of their products. One of these is known as life testing, which uses a burn-in process that accelerates the aging of a device under test by cycling power, running devices at elevated voltage and temperature, and exposing the devices to conditions beyond the environment they were intended to operate in. Manufacturers found that devices that survived their infancy would continue to operate without problems until their expected lifetime was reached. Devices in their middle years rarely failed, even when exposed to conditions just outside their expected operating range. The graph of failure rate over time became known as the bathtub curve, because it looks like a bathtub viewed from the side. Components fresh off the manufacturing line display a high failure rate when first turned on. That failure rate drops quickly, so that in a short time a steady but extremely low failure rate holds over the remaining expected years.
    Near the end of the component's life, the failure rate starts to rise again, until it quickly reaches a level as high as that seen near the beginning of the component's life. Life testing showed that components were highly reliable once they were beyond the infancy period. Manufacturers would then offer components after a burn-in process that aged the devices beyond the infancy period, and customers who needed high reliability would pay extra for these burned-in devices. Typical customers for this service included the military, NASA contractors, and the aviation and medical industries. Devices that did not go through a burn-in process were sold mostly for consumer use, but the manufacturers included a warranty whose time frame usually matched or exceeded the infancy period on the bathtub curve. Turning your computer off every night, or when not in use, would seem like a possible cause of component failure, and it's true that as your computer ages, it's most likely to fail when being turned off or on. But it's a bit counterintuitive to learn that putting stress on your system when it's young, and under warranty, may be a good thing. Remember, the bathtub curve says that device failure is most likely when components are very young, and that failure rates drop as they age. If you remove some of the expected types of stress by never power cycling your computer, you slow down the aging process; in essence, you extend the length of time the device remains susceptible to early failures. While your computer is under warranty, it may therefore be advantageous to provide a modicum of stress by turning it off when not in use, so that any failure caused by turn-on/turn-off stress happens under warranty.
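    The bathtub curve can be illustrated numerically. A common textbook construction (not something from this article) sums three Weibull hazard rates: a falling infant-mortality term, a flat useful-life term, and a rising wear-out term. All shape and scale numbers below are invented purely for illustration, not measured from real hardware.

```python
# Hazard rate = instantaneous failure rate at time t, given survival so far.
# For a Weibull distribution: h(t) = (shape/scale) * (t/scale)**(shape - 1).
# shape < 1 gives a falling rate; shape > 1 gives a rising rate.

def weibull_hazard(t, shape, scale):
    """Instantaneous failure rate of a Weibull distribution at time t > 0."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def bathtub_hazard(t):
    infant = weibull_hazard(t, shape=0.5, scale=10.0)    # falls over time
    useful = 0.001                                       # flat floor
    wearout = weibull_hazard(t, shape=5.0, scale=100.0)  # rises late in life
    return infant + useful + wearout

# Failure rate is high early, low in the middle, and high again near
# end of life, tracing the bathtub shape:
early, middle, late = bathtub_hazard(1), bathtub_hazard(50), bathtub_hazard(120)
```

    With these toy parameters, the rate at t=1 and t=120 is several times the rate at t=50, which is exactly the "reliable middle years" the life-testing data showed.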
    Leaving your computer turned on 24/7 removes a few of the known stress events that lead to component failure, including the in-rush of current that can damage some devices and the voltage swings and surges that occur when turning a computer off. This is especially true as your computer ages and comes closer to the end of its expected life. By not cycling the power, you can protect older computers from failure, at least for a while. For younger computers, however, it may be more of a "don't care" issue: components in their teenage through adult years remain very stable and are unlikely to fail from conventional power cycling (turning the computer off at night). For new computers, there's the concern that removing stress slows aging, extending the window for early failures beyond the normal warranty period.

    Using Both Options: Turn the Computer Off When New, and Leave It On With Age

    Do what you can to mitigate environmental stress factors, such as operating temperature. This can be as simple as running a fan in the hot months to ensure air movement around your computer system. Use a normal turn-on and turn-off cycle (that is, turn the computer off when not in use) during the original manufacturer's warranty period. This helps ensure the components age through the high-failure infancy period while under warranty, to the time frame when failure rates fall to a low level. It also helps ensure that any failure that does happen occurs under warranty, saving you some serious coin. Once you move beyond the warranty period, the components should have aged past the infant mortality time frame and entered their teenage years, when they're tough and can stand up to just about any reasonable amount of stress thrown at them. At this point, you can switch to a 24/7 operating mode if you wish. So: new computer, turn it on and off as needed. Teenage to adult, it's up to you; there's no real benefit either way.
    Senior, keep it on 24/7 to extend its life.

    When Running 24/7, Which Is Better: Sleep or Hibernation?

    One possible problem with running your computer 24/7, even when it isn't actively being used, is that it may enter a hibernation mode that is, to the hardware, very similar to turning the computer off and back on again. Depending on your computer and the OS it's running, it may support multiple types of power saving options. Generally speaking, sleep mode is designed to reduce power consumption while keeping the computer in a semi-operational state. In this mode, your computer spins down any hard drives and optical drives it may have, RAM drops to a lower-activity state, displays are dimmed if not outright powered off, and processors run at a reduced clock rate or in a special low-power state. In sleep mode, the computer can usually continue to run some basic tasks, though not as speedily as in its normal state. Most open user apps are still loaded but sit in a standby state. There are exceptions, depending on your OS, but you get the idea: sleep mode conserves power while keeping the computer turned on. Hibernation, another way of reducing power consumption, varies a bit among Mac, Windows, and Linux OSes. In hibernation mode, running apps are put into a standby state, and then the contents of RAM are copied to your computer's storage device. At that point, RAM and the storage devices are powered off, and most peripherals, including the display, are put into standby. Once all data has been secured, the computer is essentially turned off. Restarting from hibernation isn't much different, at least as experienced by the components that make up your computer, from turning the computer on. As you can see, if you haven't ensured that your computer won't enter hibernation after some amount of time, you're not really keeping it on 24/7.
    So you may not be achieving the effect you wanted by not turning your computer off. If your intent is to run your computer 24/7 to perform various processing tasks, you'll want to disable all sleep modes except display sleep; you probably don't need the display active to run those tasks. The method for using only display sleep differs among operating systems. Some OSes have another sleep mode that allows specified tasks to run while placing all remaining tasks in standby. In this mode, power is conserved, but processes that need to run are allowed to continue. On the Mac, this is known as App Nap; Windows has an equivalent known as Connected Standby, or Modern Standby in Windows 10. No matter what it's called, or the OS it runs on, the purpose is to conserve power while allowing some apps to run. For running your computer 24/7, this type of sleep mode doesn't exhibit the power cycling seen in hibernation, so it can meet the needs of those who don't wish to turn their computers off.

    Leave the Computer On or Turn It Off: Final Thoughts

    If you're asking whether it's safe to turn your computer on and off as needed, the answer is yes. It's not something to worry about until the computer reaches old age. If you're asking whether it's safe to leave a computer on 24/7, the answer is also yes, with a couple of caveats. You need to protect the computer from external stress events such as voltage surges, lightning strikes, and power outages. Of course, you should be doing this even if you plan to turn the computer on and off, but the risk is slightly greater for computers left on 24/7, simply because they're more likely to be on when a severe event occurs, such as a summer thunderstorm rolling through your area.
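    On Windows, a script that needs the machine awake for a long task can ask the OS directly rather than fiddling with power-plan settings. This sketch uses the documented SetThreadExecutionState call in kernel32; the flag values are from Microsoft's API documentation, while the function wrapper and its behavior on non-Windows platforms (it simply returns False) are my own framing. macOS users would reach for `caffeinate` and Linux users for systemd-inhibit instead.

```python
import ctypes
import sys

# Documented SetThreadExecutionState flags (winbase.h):
ES_CONTINUOUS = 0x80000000       # hold the requested state until cleared
ES_SYSTEM_REQUIRED = 0x00000001  # keep the system (not the display) awake

def keep_system_awake(enable=True):
    """Request (or clear) a 'no system sleep' state. Windows only.

    Returns True if the request was made, False on non-Windows platforms.
    The display can still sleep; only system sleep/hibernation is deferred.
    """
    if sys.platform != "win32":
        return False
    flags = ES_CONTINUOUS | (ES_SYSTEM_REQUIRED if enable else 0)
    ctypes.windll.kernel32.SetThreadExecutionState(flags)
    return True
```

    Calling keep_system_awake(False) clears the request, letting the normal power plan (including hibernation timers) take over again.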

    Blog Entry, Hardware, KnowledgeBase (KB)
  • Posted on September 20, 2017 9:35 am
    Joseph Forbes
    No comments

    Whether you're a home PC user or a network administrator, you always need a plan for when the unexpected happens to your computers and/or network. A Disaster Recovery Plan (DRP) is essential in helping to ensure that you don't get fired after a server gets fried in a fire, or, in the case of the home user, that you don't get kicked out of the house when mamma discovers you've just lost years' worth of irreplaceable digital baby photos. A DRP doesn't have to be overly complicated. You just need to cover the basic things it will take to get back up and running again if something bad happens. Here are some items that should be in every good disaster recovery plan:

    1. Backups, Backups, Backups! Most of us think about backups right after we've lost everything in a fire, flood, or burglary. We think to ourselves, "I sure hope I have a backup of my files somewhere." Unfortunately, wishing and hoping won't bring back dead files or keep your wife from flogging you about the head and neck after you've lost gigabytes of family photos. You need a plan for regularly backing up your critical files so that when a disaster occurs you can recover what was lost. There are dozens of online backup services available that will back up your files to an off-site location via a secure connection. If you don't trust "The Cloud," you can keep things in-house by purchasing an external backup storage device such as a Drobo. Whichever method you choose, set a schedule to back up all your files at least once weekly, with incremental backups each night if possible. Additionally, you should periodically make a copy of your backup and store it off-site in a fire safe, safe deposit box, or somewhere other than where your computers reside. Off-site backups are important because your backup is useless if it's burned up in the same fire that just torched your computer.

    2. Document Critical Information. If you encounter a major disaster, you're going to lose a lot of information that may not be inside a file. This information will be critical to getting back to normal and includes items such as:
    - Make, model, and warranty information for all your computers and other peripherals
    - Account names and passwords (for e-mail, ISP, wireless routers, wireless networks, admin accounts, system BIOS)
    - Network settings (IP addresses of all PCs, firewall rules, domain info, server names)
    - Software license information (list of installed software, license keys for re-installation, version info)
    - Support phone numbers (for ISP, PC manufacturer, network administrators, tech support)

    3. Plan for Extended Downtime. If you're a network administrator, you'll need a plan that covers what you will do if the downtime from the disaster is expected to last more than a few days. You'll need to identify possible alternate sites to house your servers if your facilities are going to be unusable for an extended period. Check with your management before looking into alternatives to get their buy-in. Ask them questions such as: How much downtime is tolerable based on their business needs? What is the restoration priority (which systems do they want back online first)? What is their budget for disaster recovery operations and preparation?

    4. Plan for Getting Back to Normal. You'll need a transition plan for moving your files off the loaner you borrowed and onto the new PC you bought with your insurance check, or for moving from your alternate site back to your original server room after it's been restored to normal. Test and update your DRP regularly. Keep it up to date with all the latest information (updated points of contact, software version information, etc.). Check your backup media to make sure it is actually backing something up and not just sitting idle. Check the logs to make sure the backups are running on the schedule you set up. Again, your disaster recovery plan shouldn't be overly complicated. You want it to be useful and always within arm's reach. Keep a copy of it off-site as well. Now, if I were you, I would go start backing up those baby pics ASAP!
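    A quick way to catch a backup schedule that has quietly stopped running is to check how old the newest file in the backup destination is. This is a minimal sketch under stated assumptions: the backup folder layout, the two-day threshold (matching nightly incrementals), and the function names are all illustrative, not part of any particular backup product.

```python
import os
import time

def newest_mtime(folder):
    """Return the most recent modification time among files in folder (0.0 if empty)."""
    times = [
        os.path.getmtime(os.path.join(folder, name))
        for name in os.listdir(folder)
        if os.path.isfile(os.path.join(folder, name))
    ]
    return max(times) if times else 0.0

def backup_is_stale(folder, max_age_days=2):
    """True if the newest backup file is older than the schedule allows."""
    age_seconds = time.time() - newest_mtime(folder)
    return age_seconds > max_age_days * 86400
```

    Run from a nightly scheduled task, a True result is your cue to go read the backup logs before a disaster reads them for you.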

    Blog Entry, DATA, Data Recovery
  • Posted on September 17, 2017 9:30 am
    Joseph Forbes
    No comments

    Whether you're managing disaster preparation activities for a small business or a large corporation, you need to plan for natural disasters because, as we all know, information technology and water don't mix well. Let's go over some basic steps you'll need to take to ensure that your network and IT investments survive a disaster such as a flood or hurricane.

    1. Develop a Disaster Recovery Plan. The key to successfully recovering from a natural disaster is to have a good disaster recovery plan in place before something bad happens. This plan should be periodically tested to ensure that all parties involved know what they are supposed to do during a disaster event. The National Institute of Standards and Technology (NIST) has excellent resources on how to develop disaster recovery plans. Check out NIST Special Publication 800-34 on contingency planning to find out how to get started developing a rock-solid disaster recovery plan.

    2. Get Your Priorities Straight: Safety First. Obviously, protecting your people is the most important thing. Never put your network and servers ahead of keeping your staff safe, and never operate in an unsafe environment. Always ensure that facilities and equipment have been deemed safe by the proper authorities before any recovery or salvage operations begin. Once safety issues have been addressed, you should have a system restoration priority so you can focus on what it will take to stand up your critical infrastructure and servers at an alternate location. Have management identify which business functions they want back online first, and then focus planning on restoring what is needed to ensure safe recovery of mission-critical systems.

    3. Label and Document Your Network and Equipment. Pretend you just found out that a major storm is two days away and that it is going to flood your building. Most of your infrastructure is in the basement, which means you are going to have to relocate the equipment elsewhere. The tear-down process will likely be rushed, so you need to have your network well documented so that you can resume operations at an alternate location. Accurate network diagrams are essential for guiding network technicians as they reconstruct your network at the alternate site. Label things as much as you can, with straightforward naming conventions that everyone on your team understands. Keep a copy of all network diagram information at an off-site location.

    4. Prepare to Move Your IT Investments to Higher Ground. Since our friend gravity likes to keep water at the lowest point possible, you'll want to plan to relocate your infrastructure equipment to higher ground in the event of a major flood. Make arrangements with your building manager to have a safe storage location on a non-flood-prone floor where you can temporarily move network equipment that might otherwise be flooded during a natural disaster. If the entire building is likely to be trashed or flooded, find an alternate site that is not in a flood zone. You can visit the FloodSmart.gov website and enter the address of your potential alternate site to see whether it is located in a flood zone. If it is in a high-risk flood area, you may want to consider relocating your alternate site. Make sure your disaster recovery plan covers the logistics of who's going to move what, how they are going to do it, and when they are going to move operations to the alternate site. Move the expensive stuff first (switches, routers, firewalls, servers) and the least expensive stuff last (PCs and printers). If you're designing a server room or data center, consider locating it in an area of your building that won't be prone to flooding, such as a non-ground-level floor; this will save you the headache of relocating equipment during a flood.

    5. Make Sure You Have Good Backups Before Disaster Strikes. If you don't have good backups to restore from, it won't matter whether you have an alternate site, because you won't be able to restore anything of value. Check that your scheduled backups are working, and check backup media to make sure it is actually capturing data. Be vigilant. Make sure that your administrators are reviewing backup logs and that backups are not silently failing.
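    Catching silently failing backups usually means reading the logs, and that chore is easy to automate. This sketch scans a hypothetical plain-text backup log for error markers; the marker strings and log format are assumptions for illustration, since every backup product logs differently. Wire something like it into a nightly cron job or Task Scheduler task that emails the result to an admin.

```python
# Marker strings that commonly indicate a failed backup run; adjust to
# whatever your backup software actually writes (this list is illustrative).
ERROR_MARKERS = ("ERROR", "FAILED", "ABORTED")

def find_backup_failures(log_lines):
    """Return (line_number, line) pairs that look like backup failures."""
    return [
        (lineno, line.strip())
        for lineno, line in enumerate(log_lines, start=1)
        if any(marker in line.upper() for marker in ERROR_MARKERS)
    ]
```

    An empty result does not prove the backup succeeded (a job that never ran writes no errors), which is why the freshness of the backup files themselves should be checked too.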

    DATA, Hardware, Security
  • Posted on September 2, 2017 10:13 am
    Joseph Forbes
    No comments

    All Windows computers include features that protect the operating system from hackers, viruses, and various types of malware. There are also protections in place to prevent mishaps brought on by the users themselves, such as the unintentional installation of unwanted software or changes to crucial system settings. Most of these features have existed in some form for years. One of them, Windows Firewall, has been a part of Windows since XP and was included with 7, 8, 8.1, and, more recently, Windows 10. It’s enabled by default. Its job is to protect the computer, your data, and even your identity, and it runs in the background all the time. But what exactly is a firewall, and why is it necessary? To understand this, consider a real-world example. In the physical realm, a firewall is a wall designed specifically to stop or prevent the spread of existing or approaching flames. When a threatening fire reaches the firewall, the wall stands its ground and protects what’s behind it. Windows Firewall does the same thing, except with data (or, more specifically, data packets). One of its jobs is to look at what’s trying to come into (and go out of) the computer from web sites and email, and to decide whether that data is dangerous. If it deems the data acceptable, it lets it pass; data that could be a threat to the stability of the computer or the information on it is denied. It is a line of defense, just as a physical firewall is. This, however, is a very simplistic explanation of a very technical subject.

    Why and How to Access Firewall Options

    Windows Firewall offers several settings that you can configure. For one, it’s possible to configure what the firewall blocks and what it allows. You can manually block a program that’s allowed by default, such as Microsoft Tips or Get Office; when you block these programs you, in essence, disable them. If you’re not a fan of the reminders you get to buy Microsoft Office, or if the tips are distracting, you can make them disappear. You can also opt to let apps pass data through your computer that aren’t permitted by default. This often occurs with third-party apps you install, like iTunes, because Windows requires your permission to allow both installation and passage. The features can also be Windows-related, such as the option to use Hyper-V to create virtual machines or Remote Desktop to access your computer remotely. You also have the option to turn off the firewall completely. Do this only if you opt to use a third-party security suite, like the anti-virus programs offered by McAfee or Norton, which frequently ship as free trials on new PCs, or if you’ve installed a free third-party firewall (which I’ll discuss later in this article). If either of these is the case, read “How to Disable the Windows Firewall” for more information.

    Note: It is vitally important to keep a single firewall enabled and running. Don’t disable the Windows Firewall unless you have another in place, and don’t run multiple firewalls at the same time.

    When you’re ready to make changes to Windows Firewall, access the firewall options:

    1. Click in the Search area of the Taskbar.
    2. Type Windows Firewall.
    3. In the results, click Windows Firewall Control Panel.

    From the Windows Firewall area you can do several things. The option to Turn Windows Firewall On or Off is in the left pane. It’s a good idea to check here every now and then to see whether the firewall is indeed enabled; some malware, should it get by the firewall, can turn it off without your knowledge. Simply click to verify, and then use the Back arrow to return to the main firewall screen. You can also restore the defaults if you’ve changed them: the option Restore Defaults, again in the left pane, offers access to these settings.
    How to Allow an App Through the Windows Firewall

    When you allow an app in Windows Firewall, you choose to let it pass data through your computer based on whether you’re connected to a private network, a public one, or both. If you select only Private, you can use the app or feature when connected to a private network, such as one in your home or office. If you choose Public, you can access the app while connected to a public network, such as a network in a coffee shop or hotel. As you’ll see here, you can also choose both. To allow an app through the Windows Firewall:

    1. Open the Windows Firewall. You can search for it from the Taskbar as detailed earlier.
    2. Click Allow an App or Feature Through Windows Firewall.
    3. Click Change Settings and type an administrator password if prompted.
    4. Locate the app to allow. It won’t have a check mark beside it.
    5. Click the checkbox(es) to allow the entry. There are two options: Private and Public. Start with Private only, and select Public later if you don’t get the results you want.
    6. Click OK.

    How to Block a Program with the Windows 10 Firewall

    The Windows Firewall allows some Windows 10 apps and features to pass data into and out of a computer without any user input or configuration. These include Microsoft Edge and Microsoft Photos, and necessary features like Core Networking and Windows Defender Security Center. Other Microsoft apps, like Cortana, might require your explicit permission when you first use them. This opens the required ports in the firewall, among other things. We use the word “might” here because the rules can and do change, and as Cortana becomes more and more integrated, it could be enabled by default in the future. That said, this means that other apps and features could be enabled that you do not want. For instance, Remote Assistance is enabled by default. This program allows a technician to remotely access your computer to help you resolve a problem, if you agree to it.
    Even though this app is locked down and quite secure, some users consider it an open security hole. If you’d rather close that option, you can block access to the feature. There are also third-party apps to consider. It’s important to keep unwanted apps blocked (or possibly uninstalled) if you don’t use them. When working through the next few steps, then, check for entries that involve file sharing, music sharing, photo editing, and so forth, and block those that don’t need access. If and when you use such an app again, you’ll be prompted to allow it through the firewall at that time. This keeps the app available should you need it, and is thus better than uninstalling in many instances. It also prevents you from accidentally uninstalling an app that the system needs to function properly. To block a program on a Windows 10 computer:

    1. Open the Windows Firewall. You can search for it from the Taskbar as detailed earlier.
    2. Click Allow an App or Feature Through Windows Firewall.
    3. Click Change Settings and type an administrator password if prompted.
    4. Locate the app to block. It will have a check mark beside it.
    5. Click the checkbox(es) to disallow the entry. There are two options: Private and Public. Select both.
    6. Click OK.

    Once you’ve done this, the apps you’ve selected are blocked based on the network types you’ve selected.

    Consider a Free Third-Party Firewall

    If you would rather use a firewall from a third-party vendor, you can. Remember, though, that the Windows Firewall has a good track record, and your wireless router, if you have one, does a good amount of work too, so you don’t have to explore other options if you don’t want to. It’s your choice, and if you want to try a third-party firewall, here are a few free options:

    ZoneAlarm Free Firewall - ZoneAlarm has been around for a very long time and is a trusted name. It protects your computer on many levels, from hiding open ports to real-time security updates. It’s easy to download and set up and doesn’t require a lot of attention once it’s running.

    TinyWall - Simple to use, effective, and non-intrusive, this firewall is a good choice for users with only a little experience but a healthy curiosity. Download TinyWall safely from CNET.

    Comodo Firewall - This firewall comes with a full security suite and is best for more advanced users. It includes automatic updates but not a lot of built-in help.

    For more information about free firewalls, refer to the article "10 Free Firewall Programs". Whatever you decide to do, or not do, with the Windows Firewall, remember that you need a working, running firewall to protect your computer from malware, viruses, and other threats. It’s also important to check every now and then, perhaps once a month, that the firewall is engaged. If new malware gets by the firewall, it can disable it without your knowledge. If you forget to check, though, it’s highly likely you’ll hear from Windows about it through a notification; pay attention to any notification you see about the firewall and resolve it immediately. These notifications appear in the notification area of the Taskbar, on the far right side.
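    For what it’s worth, the same block-a-program result the GUI steps produce can be scripted with Windows’ built-in `netsh advfirewall` tool, run from an elevated prompt. This is a hedged sketch, not part of the article’s instructions: the rule name and program path are placeholders, and on non-Windows systems the function simply returns False.

```python
import subprocess
import sys

def block_program(rule_name, exe_path):
    """Add an inbound block rule for exe_path via netsh. Returns True on success.

    Windows only, and netsh requires administrator rights to add rules.
    """
    if sys.platform != "win32":
        return False  # netsh only exists on Windows
    result = subprocess.run(
        [
            "netsh", "advfirewall", "firewall", "add", "rule",
            f"name={rule_name}", "dir=in", "action=block",
            f"program={exe_path}",
        ],
        capture_output=True,
    )
    return result.returncode == 0

# Example (hypothetical path; run as administrator):
# block_program("Block ExampleApp", r"C:\Program Files\Example\app.exe")
```

    Scripting the rule is handy when you manage several machines, but the Control Panel route described above is simpler for a one-off change.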

    Blog Entry, Internet, KnowledgeBase (KB)