While many businesses may feel that their networks are protected because their servers are backed up regularly, the truth is that any business device without anti-virus software installed presents a serious threat. This includes devices running home versions of Windows, which are not designed to operate in a business environment.
How business networks are exposed
One of the most important changes to IT in recent years is the sheer number of devices connected to a network, and the complexity involved in tracking all their activity. In the past this was less of a problem, because protecting against viruses was a matter of protecting systems against people. That meant understanding how a potential hacker might attempt to force entry into a system, or which vulnerabilities they might choose to focus on. In practice, it often meant preventing malware from running executable files (.exe) and stopping a virus from communicating back to its source.
Today, hackers have outsourced much of this work to small automated programs known as 'robots', or 'bots'. These now do the heavy lifting, scanning for security information and potential weaknesses to report back to their owner. The same crawling technology was popularised by search engines such as Google, which use robots to index web pages.
What is a robot?
A 'robot' is a piece of script which automatically visits web pages and follows the links it finds, a process known as crawling. Site owners publish a robots.txt file at the root of a website to tell crawlers which parts of the site they may index. Ironically, that file often lists the very directory names the owner considers sensitive, and so it can help map how information is organised on the server. As a crawler works through a site, it can also record directory names, file paths and login pages, building a picture of how information is moved from one place to another.
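To make this concrete, here is a minimal, hypothetical Python sketch of how a bot might read a site's robots.txt and extract the directory names it mentions. The fetching step and parsing are illustrative assumptions, not a production crawler:

```python
# Minimal sketch of bot reconnaissance via robots.txt.
# The fetch helper and URL layout are illustrative assumptions.
from urllib.request import urlopen

def parse_disallowed(robots_txt):
    """Return the paths a robots.txt file asks crawlers to avoid.

    Ironically, these are often the directories a site owner most
    wants kept private, handing a map to unfriendly bots.
    """
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

def fetch_disallowed(host):
    """Download https://<host>/robots.txt and list its disallowed paths."""
    with urlopen(f"https://{host}/robots.txt") as resp:
        return parse_disallowed(resp.read().decode("utf-8", errors="replace"))
```

The point is not that robots.txt is dangerous in itself, but that anything published at a predictable location will be read by automated scripts, friendly or otherwise.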
A hacker can use standardised naming conventions, the default directory and file names most software installs with, to target specific files and exploit known weaknesses in website applications and server environments. In the case of attacking a network, they may attempt to find a computer which is not protected and install keylogging software to gather passwords. Without anti-virus software, this scenario is entirely possible.
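As a defensive illustration, the sketch below checks a server you own for a few well-known default paths that automated scanners typically try first. The path list and the probe helper are assumptions for the example, not an exhaustive audit tool, and it should only ever be pointed at systems you are authorised to test:

```python
# Defensive sketch: check a server you control for well-known default
# paths that automated scanners probe. Paths listed are illustrative.
from urllib.request import urlopen
from urllib.error import URLError

COMMON_PATHS = ["/admin/", "/wp-login.php", "/phpmyadmin/", "/.git/config"]

def _http_ok(url, timeout=5):
    """Return True if the URL answers with HTTP 200."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False

def exposed_paths(base_url, probe=_http_ok):
    """List which well-known default paths are reachable on base_url."""
    base = base_url.rstrip("/")
    return [p for p in COMMON_PATHS if probe(base + p)]
```

If any of these respond, an attacker's bot will find them just as quickly as you did.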
How can this be stopped?
First, there is a simple way to discourage this type of reconnaissance, called the Robots Exclusion Protocol (REP). This is a small file, robots.txt, which tells automated scripts not to index a website, or particular parts of it. It has an important limitation, however: compliance is voluntary. Friendly robots such as Google's crawler and automated business directories will obey it, but a malicious bot can simply ignore it, so blocking everything mainly costs you legitimate indexing without guaranteeing protection.
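As a sketch, a typical robots.txt placed at the web root looks like the following (the paths and user-agent names here are illustrative). Compliant crawlers read the group matching their name and honour it; hostile scanners do not:

```
# robots.txt - advisory rules for crawlers; hostile bots ignore them
User-agent: *
Disallow: /private/
Disallow: /staging/

User-agent: Googlebot
Allow: /
```

Because these rules are advisory, robots.txt should be treated as traffic guidance for well-behaved software, never as a security control.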
Altering the standard naming conventions on a server, for example renaming default directories and file paths, can stop automated programs from finding known weaknesses. The same idea can be used to recognise potential attacks: by leaving decoy files with the standard names in place and setting up the server to record any attempt to access them, penetration attempts can be spotted early. Done incorrectly, however, renaming can cause problems of its own, because many plugins rely on standard naming conventions to work.
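A simple way to act on those decoy paths is to watch the web server's access log for hits against them. The sketch below assumes a common Apache/nginx "combined" log format and illustrative decoy names; it is a starting point, not a monitoring product:

```python
# Illustrative sketch: flag requests to decoy paths in a web server
# access log. DECOY_PATHS and the log format are assumptions.
import re

DECOY_PATHS = {"/wp-admin/", "/old-backup/"}   # no real user should request these
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

def suspicious_hits(log_lines):
    """Yield (client_ip, path) for every request to a decoy path."""
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group(2) in DECOY_PATHS:
            yield m.group(1), m.group(2)
```

Any IP address that turns up here is probing paths no legitimate visitor would request, which makes it a strong candidate for blocking or further investigation.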
One of the most effective ways to protect a web server is to run regular penetration and vulnerability tests across a network domain or server. We offer this service through our partners at Hacker Guardian, hardening web servers against known web attacks, which is critical for ecommerce and any web-based payments management.
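To give a flavour of one small part of such an assessment, the sketch below checks which common TCP ports answer on a host you are authorised to test. Real vulnerability scanners go far deeper than this; the port list is an assumption for illustration:

```python
# Trivial illustration of attack-surface checking: which common TCP
# ports accept connections on a host you are authorised to test.
import socket

COMMON_PORTS = {21: "ftp", 22: "ssh", 80: "http", 443: "https", 3306: "mysql"}

def open_ports(host, ports=COMMON_PORTS, timeout=1.0):
    """Return {port: service} for ports accepting TCP connections."""
    found = {}
    for port, name in ports.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                found[port] = name
        except OSError:
            pass  # closed, filtered, or unreachable
    return found
```

Every open port is a doorway that needs a reason to exist; anything unexplained here is exactly what a professional test would flag first.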
Is your network protected?
Much like a chain is only as strong as its weakest link, a network is only as strong as its weakest computer. This is why it is vital to have a consistent approach to security and ensure there are no weaknesses to expose. To achieve this, we work with partners like Symantec, whose cloud-based security is among the most advanced anti-virus protection available and is used by a third of the world's blue-chip companies.