Web Application Reconnaissance Process

Understanding the inner workings of web applications is crucial for identifying vulnerabilities and securing systems. Web application reconnaissance is the foundational step in that work: it lets security professionals gather critical information about an application and map out its structure before the deeper testing phases begin. By systematically collecting and documenting the application's components, testers are better prepared for vulnerability assessment and penetration testing, which ultimately strengthens the overall security posture. This guide walks through the essential steps, tools, and best practices of the web application reconnaissance process.

"All knowledge in this article is provided solely for educational and information-security purposes.
It must not be used to attack systems that you do not own or are not authorized to test."

Setup

Install the tools used throughout this guide: subfinder, assetfinder, httpx, gau, waybackurls, fff, and gf.
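All of these are Go programs, so one typical way to install them is with `go install` (a sketch assuming a recent Go toolchain is on your PATH; module paths are taken from each project's repository, and some of the older tomnomnom tools may instead need `go get` or a build from source):

```shell
# Install the recon tools used in this guide (requires a modern Go toolchain).
# Module paths below match the upstream repositories; verify against each README.
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
go install github.com/lc/gau/v2/cmd/gau@latest
go install github.com/tomnomnom/assetfinder@latest
go install github.com/tomnomnom/waybackurls@latest
go install github.com/tomnomnom/fff@latest
go install github.com/tomnomnom/gf@latest
```

The binaries land in `$(go env GOPATH)/bin`, so make sure that directory is on your PATH.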

Step 1: Subdomain Enumeration

Collect subdomains of the target with the following tools. Both query a range of passive data sources (certificate transparency logs, DNS datasets, and so on); google.com below is only an example target domain.

i) subfinder (https://github.com/projectdiscovery/subfinder)

$ subfinder -d google.com > subdomain.txt

ii) assetfinder (https://github.com/tomnomnom/assetfinder)

$ assetfinder -subs-only google.com > asf.txt

$ cat asf.txt subdomain.txt | sort -u > subdomains.txt
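Piping the merge through sort -u matters because subfinder and assetfinder often return overlapping results, and plain concatenation would keep the duplicates. A quick local demonstration (the demo filenames are made up for illustration):

```shell
# Two overlapping subdomain lists, as subfinder and assetfinder might produce.
printf 'a.example.com\nb.example.com\n' > asf_demo.txt
printf 'b.example.com\nc.example.com\n' > subdomain_demo.txt

# Plain cat would keep b.example.com twice; sort -u keeps one copy of each host.
cat asf_demo.txt subdomain_demo.txt | sort -u > subdomains_demo.txt
wc -l < subdomains_demo.txt   # 3 unique hosts
```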

Step 2: Subdomain Filtering Process

Some entries in the collected subdomains.txt file may be inactive, so we use the httpx tool (https://github.com/projectdiscovery/httpx) to probe each host and keep only the subdomains that actually respond.

$ cat subdomains.txt | httpx > livesubs.txt

(The HTTPX tool processes the subdomains from the subdomains.txt file, filtering out inactive ones and saving the verified live subdomains in the livesubs.txt file for further reconnaissance.)
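httpx can also annotate its output, which helps when triaging a long list of hosts. For example (flag names as documented by projectdiscovery; adjust to your installed version):

```shell
# Probe each host quietly and record the HTTP status code and page title
# alongside each live URL, instead of the bare URL list.
cat subdomains.txt | httpx -silent -status-code -title > livesubs_annotated.txt
```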

Step 3: Gathering URLs

Tools used in this step: gau (https://github.com/lc/gau), waybackurls (https://github.com/tomnomnom/waybackurls), and fff (https://github.com/tomnomnom/fff).

----------------

Run these commands in order:

$ cat livesubs.txt | gau | tee gau.txt

$ cat livesubs.txt | waybackurls | tee wayback.txt

$ cat gau.txt wayback.txt | sort -u > urls.txt

$ cat urls.txt | fff | tee liveurls.txt

----------------

Step 4: Bug Hunting

The gf tool (https://github.com/tomnomnom/gf) is used to grep the collected URLs for parameter patterns that often indicate specific vulnerability classes.

----------------

$ cat liveurls.txt | gf ssrf (for testing ssrf)

$ cat liveurls.txt | gf rce (for testing rce)

$ cat liveurls.txt | gf xss (for testing xss)

$ cat liveurls.txt | gf sqli (for testing sqli)

----------------
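Note that gf only works once its pattern files are installed into ~/.gf (see the gf README); `gf -list` shows which patterns are available. Conceptually, gf is grep with saved patterns, which you can approximate directly. A toy stand-in (the regex below is a simplified illustration, not gf's real pattern set):

```shell
# A small URL list and a grep stand-in for gf-style parameter hunting.
printf 'https://x.test/page?q=hello\nhttps://x.test/img/logo.png\nhttps://x.test/go?redirect=/home\n' > urls_demo.txt

# Flag URLs whose query parameters commonly reflect into responses or redirects.
grep -E '[?&](q|search|redirect|url)=' urls_demo.txt > candidates_demo.txt
cat candidates_demo.txt   # keeps the ?q= and ?redirect= URLs, drops the image
```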

Beyond these automated steps, further reconnaissance can be performed manually, for example with Burp Suite, among other tools.

Summary: All Commands Used

----------------

$ subfinder -d target.com > subdomain.txt

$ assetfinder -subs-only target.com > asf.txt

$ cat subdomain.txt asf.txt | sort -u | tee subdomains.txt

$ cat subdomains.txt | httpx | tee livesubs.txt

$ cat livesubs.txt | gau | tee gau.txt

$ cat livesubs.txt | waybackurls | tee wayback.txt

$ cat gau.txt wayback.txt | sort -u | tee urls.txt

$ cat urls.txt | fff | tee liveurls.txt

$ cat liveurls.txt | gf ssrf | tee ssrf.txt

$ cat liveurls.txt | gf sqli | tee sqli.txt

$ cat liveurls.txt | gf xss | tee xss.txt

$ cat liveurls.txt | gf redirect | tee redirect.txt
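For convenience, the whole pipeline can be wrapped in a small script (a sketch assuming all of the tools above are installed and on your PATH; the script name recon.sh and the output filenames are arbitrary choices):

```shell
#!/bin/sh
# recon.sh <domain> - run the full reconnaissance pipeline from this guide.
set -eu
domain="${1:?usage: recon.sh <domain>}"

# Step 1: subdomain enumeration, merged and deduplicated.
subfinder -d "$domain" -silent > subdomain.txt
assetfinder -subs-only "$domain" > asf.txt
cat subdomain.txt asf.txt | sort -u > subdomains.txt

# Step 2: keep only live hosts.
httpx -silent < subdomains.txt > livesubs.txt

# Step 3: gather historical URLs, then probe them.
gau < livesubs.txt > gau.txt
waybackurls < livesubs.txt > wayback.txt
cat gau.txt wayback.txt | sort -u > urls.txt
fff < urls.txt > liveurls.txt

# Step 4: split candidate URLs by gf pattern.
for pattern in ssrf sqli xss redirect; do
    gf "$pattern" < liveurls.txt > "$pattern.txt"
done
```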

Ref: https://shubhdhungana.medium.com/web-application-reconnaissance-guide-cybersec-shubham-dhungana-17858c967e2b

Published by Nhat Truong
