When beginning Ethical Hacking, Reconnaissance may be viewed as one of the less interesting topics. But bear with it: Reconnaissance can be very interesting, and it is part of the larger field of OSINT.
In this post we explore the different techniques Hackers use to gather information, and what a Cybersecurity Team can do to prevent it from happening.
Typically the Ethical Hacker has approval to do what they do and does not have to spend much time on “non-technical attack vectors”, so it might not seem necessary to spend much time on the Reconnaissance stage. However, the Black Hat Hacker has to spend a lot of time on Reconnaissance: they have little knowledge of their target and need to gather as much information as possible to advance. Real attackers do not follow the same code of conduct as an Ethical Hacker. Anonymity is key for the attacker, and the 50,000-foot view allows them to gather information without the risk of raising any alarm with the target. Therefore, the Ethical Hacker also needs to learn and be aware of these processes.
What is Reconnaissance?
When Ethical Hacking, the reconnaissance phase is the first phase a Hacker works through when identifying targets to exploit. Reconnaissance is often called the “50,000-foot view”: it can be compared to stepping back and getting the big picture.
When attackers target networks they need reconnaissance: reconnaissance that helps them breach the target, and reconnaissance that helps them stay anonymous. The more reconnaissance a Hacker has before they directly interact with the target, the better their chances of success and anonymity. In the next sections I will cover some of the techniques that can be used to gather reconnaissance and build a profile of a target, without actively engaging the target or its systems.
Aren’t search engines just wonderful things? The amount of reconnaissance one can get from them is truly phenomenal. Unfortunately, that means there is lots of information that Cyber Attackers can use when planning an attack. Let’s take a look at what we can do with some of these search engines.
Google is the most famous search engine in the world. Somewhere in its database there is going to be some reconnaissance about the target, but it can be hard to find, unless you know how. If you didn’t know, Google supports a lot of “keywords” (search operators) that can be used to narrow down search results. For example,
- intitle: only search in the title of the site.
- inurl: only search in the URL.
- site: only search a specific site.
- ext: only look in files with these extensions.
For lots of examples of Google Dorking, take a look at exploit-db, which keeps a database of some of the more popular ones.
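To make the operators concrete, here is a small sketch that composes dork queries from them. The operator names are real Google search keywords; the `build_dork()` helper itself and the `example.com` domain are just illustrations.

```python
def build_dork(site=None, intitle=None, inurl=None, ext=None, terms=""):
    """Assemble a Google dork string from optional search operators."""
    parts = []
    if site:
        parts.append(f"site:{site}")        # restrict to one site
    if intitle:
        parts.append(f'intitle:"{intitle}"')  # quote multi-word titles
    if inurl:
        parts.append(f"inurl:{inurl}")      # match text in the URL
    if ext:
        parts.append(f"ext:{ext}")          # match file extensions
    if terms:
        parts.append(terms)                 # free-text search terms
    return " ".join(parts)

# Look for exposed log files mentioning "password" on a hypothetical target:
print(build_dork(site="example.com", ext="log", terms="password"))
# site:example.com ext:log password
```

Pasting the resulting string into Google applies all the filters at once, which is exactly what the dork databases on exploit-db collect.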
Shodan is a search engine for Internet-connected devices. Web search engines, such as Google and Bing, are great for finding websites. But what if you’re interested in measuring which countries are becoming more connected? Or if you want to know which version of Microsoft IIS is the most popular? Or you want to find the control servers for malware? Maybe a new vulnerability came out and you want to see how many hosts it could affect? Traditional web search engines don’t let you answer those questions.
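Shodan answers those questions through search filters such as `product:`, `version:`, `port:` and `country:`. As a sketch, this hypothetical helper builds such a filter string; the filter names are real Shodan filters, and the resulting query could be passed to the official `shodan` Python library’s `search()` call (which needs an API key, so no request is made here).

```python
def shodan_query(**filters):
    """Build a Shodan filter string from keyword filters such as
    product, version, port or country. Values containing spaces are
    quoted, as Shodan's query syntax requires."""
    parts = []
    for key, value in filters.items():
        value = str(value)
        if " " in value:
            value = f'"{value}"'
        parts.append(f"{key}:{value}")
    return " ".join(parts)

# How widespread is a given IIS version?
print(shodan_query(product="Microsoft IIS httpd", version="7.5"))
# product:"Microsoft IIS httpd" version:7.5
```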
Wayback Machine
Searching the history of a web site can reveal some interesting things: perhaps some domains that were never maintained, or software that was never updated. The site archive.org has a tool called the Wayback Machine that saves snapshots of the internet. There are even tools that can list all the URLs the Wayback Machine has captured for a site.
The Wayback Machine has over 700 billion pages captured, so you are sure to find something of interest in there.
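The URL-listing trick mentioned above is usually done through the Wayback Machine’s CDX API. This sketch only builds the request URL; the endpoint and parameters are from the documented CDX API, but the helper function and `example.com` are illustrative, and no request is actually sent.

```python
from urllib.parse import urlencode

CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def wayback_listing_url(domain):
    """Return a CDX API URL that lists unique captured URLs for a domain."""
    params = {
        "url": f"{domain}/*",     # match everything under the domain
        "output": "json",         # machine-readable output
        "fl": "original",         # only return the original URL field
        "collapse": "urlkey",     # de-duplicate repeat captures of a URL
    }
    return f"{CDX_ENDPOINT}?{urlencode(params)}"

print(wayback_listing_url("example.com"))
```

Fetching that URL (with any HTTP client) returns every distinct URL the archive has ever captured for the site, which is exactly the historical surface an attacker would mine.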
People like to talk on social media, sometimes too much. Using our Ethical Hacking skills we can gather reconnaissance by taking a look at social media outlets that offer some sort of support group. For example, Reddit has sub-reddits on VMware, Oracle, Linux, etc., and people love to help out, often revealing which technologies their organisation uses in the process.
Looking at job listings for the target can give an insight into the technology being used. LinkedIn has a lot of job postings, and there is plenty of valuable information on targets in there.
The results from DNS scans can be valuable to hackers in identifying subdomains that could potentially be targeted. Sites such as dnsdumpster can provide invaluable information about a target’s subdomains.
Another way of finding subdomains is through scanning or fuzzing. There are different approaches to this. Some tools, such as wfuzz, knock and nmap, simply run wordlists against your system’s local DNS server to find any subdomains for the target.
However, tools such as Sublist3r will find subdomains by running queries on search engines such as Google, Yahoo, Bing, Baidu and Ask. Sublist3r also enumerates subdomains using Netcraft, VirusTotal, ThreatCrowd, DNSdumpster and ReverseDNS.
Finally, subbrute is a tool that uses a list of open DNS resolvers to actively query multiple DNS servers, so it acts like a proxy and doesn’t hammer the local DNS server. Sending thousands of requests to a single DNS server can trigger rate limiting, which will slow down a scan; using multiple open DNS servers, as subbrute does, is a way of avoiding that.
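At its core, the wordlist approach these tools share is simple: prepend each word to the target domain and see which names resolve. A stripped-down sketch, assuming a short illustrative wordlist (real tools add threading, rate limiting and the open-resolver rotation described above):

```python
import socket

def candidates(domain, wordlist):
    """Generate candidate subdomain names for a target domain."""
    return [f"{word}.{domain}" for word in wordlist]

def resolve_existing(names):
    """Return only the names that actually resolve in DNS."""
    found = []
    for name in names:
        try:
            socket.gethostbyname(name)   # raises if the name doesn't exist
            found.append(name)
        except socket.gaierror:
            pass                         # no such host, move on
    return found

# Usage, against a domain you are authorised to test:
# hits = resolve_existing(candidates("example.com", ["www", "mail", "dev"]))
```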
Another way to find out more about a target is to use online tools that are freely available.
GitHub is one of the top sites for storing code in the world. It is also built on a version control system, which means there is a lot of history in there too. For example, if something like passwords, API tokens or even AWS keys were accidentally committed to a project, it’s likely they are still there, buried in the commits somewhere.
Tools such as trufflehog do a great job of unearthing hidden users, passwords and keys.
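A toy version of what such secret scanners do is pattern matching against file contents. The AWS access key pattern below is the documented `AKIA…` format (and the key in the example is AWS’s own documentation example, not a real credential); the other patterns and the `scan_text()` helper are illustrative only, while real tools like trufflehog add entropy checks and walk the full commit history.

```python
import re

# A few well-known credential patterns (a real scanner ships hundreds).
PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    "generic_password": re.compile(r"password\s*=\s*\S+", re.IGNORECASE),
}

def scan_text(text):
    """Return (pattern_name, matched_text) pairs found in a blob of text."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits

leaked = 'aws_key = "AKIAIOSFODNN7EXAMPLE"'
print(scan_text(leaked))
# [('aws_access_key', 'AKIAIOSFODNN7EXAMPLE')]
```

Running something like this over every blob in a repository’s history is why a secret deleted in a later commit is still very much discoverable.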
“Google Dorking” can be used to identify repos belonging to targets.
You can also choose to clone the Git repositories locally in order to do a deep dive into them. A tool such as GithubCloner can do the job very well.
Once you advance and start to learn about all the tools and processes out there, you can look to automation and bug bounty automation frameworks like Axiom.
When Ethical Hacking, the critical takeaway from this post is to do the research and gather reconnaissance on your targets and their defences. By searching job postings and GitHub, and by using passive detection and online search engines, you should get an idea of what kind of defensive systems are being used. The attacker will look to get this information; without it they are working blind, just guessing at what technology is there.