Footprinting a Target Network | Footprinting Tools and Techniques | Footprinting Methodology





Footprinting a Target Network in Windows

Ethical hackers or pen testers use numerous tools and techniques to collect information about the target. Recommended labs that will assist you in learning various footprinting techniques include:


    Open Source Information Gathering using Windows Command Line

    Ping is a network administration utility used to test the reachability of a host on an IP network and to measure the round-trip time for messages sent from the originating host to a destination computer. The ping command sends ICMP echo request packets to the target host and waits for an ICMP response. During this request-response process, ping measures the time from transmission to reception, known as the round-trip time, and records any loss of packets. The ICMP type and code of the ping reply provide important insight into the network.

    nslookup is a network administration command-line tool generally used for querying the Domain Name System (DNS) to obtain a domain name to IP address mapping, or any other specific DNS record.

    traceroute is a computer network diagnostic tool for displaying the route (path) and measuring the transit delays of packets across an IP network.

    Example of footprinting a target network in cmd (Command Prompt)

    1. Find the IP address for http://www.certifiedhacker.com.

    2. Right-click the Windows icon at the lower-left corner of the screen.

    3. Click Command Prompt from the context menu to launch it.

    4. Type ping www.certifiedhacker.com in the command prompt window, and press Enter to find its IP address. The displayed response should be similar to the one shown in the following screenshot.


    5. Note the target domain’s IP address in the result above: 69.89.31.193. You also get information on Ping Statistics, such as packets sent, packets received, packets lost, and approximate round-trip time.

    Note: The IP address may differ in your lab environment.
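The statistics ping reports at the end of a run can be reconstructed from the raw reply times. A minimal Python sketch of that calculation (the `rtts` sample values are made-up illustration data; `None` marks a lost packet):

```python
# Sketch: compute ping-style statistics from a list of round-trip
# times in milliseconds; None represents a lost (timed-out) packet.
def ping_stats(rtts):
    sent = len(rtts)
    received = [r for r in rtts if r is not None]
    lost = sent - len(received)
    stats = {
        "sent": sent,
        "received": len(received),
        "lost": lost,
        "loss_pct": round(100 * lost / sent) if sent else 0,
    }
    if received:
        stats.update(minimum=min(received),
                     maximum=max(received),
                     average=sum(received) / len(received))
    return stats

# Hypothetical sample: four echo requests, one timed out.
print(ping_stats([41.0, 39.5, None, 40.5]))
```

This mirrors the "Packets: Sent / Received / Lost" and "Approximate round trip times" lines that the Windows ping command prints.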

    6. Now, find the maximum frame size on the network. In the command prompt window, type ping www.certifiedhacker.com -f -l 1500.

    7. The response, Packet needs to be fragmented but DF set, means that the frame is too large to be on the network and needs to be fragmented. Since we used the -f switch with the ping command, the packet was not sent, and the ping command returned this error.

    8. Type ping www.certifiedhacker.com -f -l 1300.

    9. Observe that the maximum packet size is less than 1500 bytes and more than 1300 bytes.

    10. Now, try different values until you find the maximum frame size. For instance, ping www.certifiedhacker.com -f -l 1473 replies with Packet needs to be fragmented but DF set, while ping www.certifiedhacker.com -f -l 1472 replies with a successful ping. This indicates that 1472 bytes is the maximum frame size on this machine’s network.


    Note: The maximum frame size will differ depending on the target network.
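The manual trial-and-error in steps 6-10 is really a binary search over payload sizes. A sketch of that search in Python, where `probe` is a hypothetical stand-in for running `ping <host> -f -l <size>` and checking for a successful reply:

```python
# Sketch: binary-search the largest unfragmented payload, automating
# the manual process of steps 6-10.  `probe(size)` stands in for
# running `ping <host> -f -l <size>` and returning True on success.
def max_payload(probe, lo=0, hi=1500):
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if probe(mid):          # ping succeeded: answer is >= mid
            lo = mid
        else:                   # "Packet needs to be fragmented": answer < mid
            hi = mid - 1
    return lo

# Simulated link where, as in the lab, 1472 bytes is the largest
# payload that fits (1472 + 28 bytes of IP/ICMP headers = 1500 MTU).
print(max_payload(lambda size: size <= 1472))
```

About eleven probes locate the boundary that steps 6-10 find by hand; on a real network you would replace the lambda with an actual ping invocation.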

    11. Now, find out what happens when the TTL (Time to Live) expires. Every frame on the network has a TTL defined. If the TTL reaches 0, the router discards the packet. This mechanism prevents packets from looping endlessly in the network.

    12. In the command prompt, type ping www.certifiedhacker.com -i 3. This option sets the time to live (-i) value as 3.




    Note: The maximum value you can set for TTL is 255.

    FIGURE: The ping command for www.certifiedhacker.com with the -i 3 option

    13. The reply from 69.89.31.193: TTL expired in transit means that the router (69.89.31.193; students will have some other IP address) discarded the frame because its TTL had expired (reached 0).

    14. We will use the ping command to emulate traceroute.

    15. Find the route from your PC to www.certifiedhacker.com using the tracert command.

    16. The results you receive might differ from those in this lab.

    17. Launch a new command prompt and type tracert www.certifiedhacker.com. This command traces the route and network configuration information of the target domain.


    18. Minimize the command prompt shown above and launch a new command prompt. In the command prompt window, type ping www.certifiedhacker.com -i 2 -n 1. The only difference from the previous ping command is that we are setting the TTL to 2 in an attempt to check the life span of the packet.

    19. In the command prompt window, type ping www.certifiedhacker.com -i 3 -n 1. This sets the TTL value to 3.

    20. Observe that there is a reply coming from IP address 69.89.31.193 and there is no packet loss.

    Note: The result displayed in the above step might differ in your lab environment.

    21. In the command prompt, type ping www.certifiedhacker.com -i 4 -n 1. This sets the time to live value as 4.





    22. Repeat the above step, incrementing the TTL, until you reach the IP address for www.certifiedhacker.com (in this case, 69.89.31.193).




    23. Here, a successful ping reaches www.certifiedhacker.com in 17 hops. The output will be similar to the tracert results.

     

    24. This implies that, at a time to live value of 17, a reply is received from the destination host (69.89.31.193).

    Note: This result might vary in your lab environment.

    25. Make a note of all the IP addresses from which you receive a reply during the pings that emulate tracert.
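The TTL-walking procedure of steps 18-25 can be sketched as a small loop. Here `probe_ttl` is a hypothetical stand-in for running `ping <host> -i <ttl> -n 1` and reading off which IP address replied (an intermediate router on "TTL expired in transit", or the destination once the TTL is large enough); the four-hop path is made up for illustration:

```python
# Sketch: emulate tracert by raising the TTL one hop at a time.
# `probe_ttl(ttl)` stands in for `ping <host> -i <ttl> -n 1` and
# returns the IP address that replied at that TTL.
def emulate_tracert(probe_ttl, destination, max_hops=30):
    hops = []
    for ttl in range(1, max_hops + 1):
        replier = probe_ttl(ttl)
        hops.append(replier)
        if replier == destination:
            break                    # reached the target host
    return hops

# Hypothetical 4-hop path ending at the lab's target address.
path = ["10.0.0.1", "192.0.2.1", "198.51.100.7", "69.89.31.193"]
print(emulate_tracert(lambda ttl: path[min(ttl, len(path)) - 1],
                      "69.89.31.193"))
```

This is exactly what tracert automates: each TTL value exposes one more router on the path until the destination itself answers.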

    26. Launch a new command prompt, type nslookup, and press Enter. This displays the default server and its address assigned to the Windows Server 2016 machine.



    Note: The DNS server address (8.8.8.8) may differ in your lab environment.

    27.  In the nslookup interactive mode, type set type=a and press Enter.

    Setting the type as a configures nslookup to query for the IP address of a given domain.

    28. Type the target domain www.certifiedhacker.com and press Enter. This resolves the IP address and displays the result shown in the following screenshot:




    29. The first two lines of the result are:

    Server: google-public-dns-a.google.com
    Address: 8.8.8.8

    This specifies that the request was directed to the default DNS server configured on the machine (here, Google's public DNS server), which resolves your requested domain.

    30. Thus, if the response comes from your machine's configured DNS server (Google) rather than from the server that legitimately hosts the domain www.certifiedhacker.com, it is considered a non-authoritative answer:

    www.certifiedhacker.com 69.89.31.193

    31. Since the result returned is non-authoritative, you need to obtain the domain's authoritative name server.

    Ethical Hacking and Countermeasures Copyright © by EC-Council. All Rights Reserved. Reproduction Strictly Prohibited.

     

    32. Type set type=cname and press Enter.

    The CNAME lookup is done directly against the domain’s authoritative name server and lists the CNAME records for the domain.

    33. Type certifiedhacker.com and press Enter.

    34. This returns the domain’s authoritative name server, along with the mail server address, as shown in the following screenshot:


    35. Since you have obtained the authoritative name server, you will need to determine the IP address of the name server.

    36. Issue the command set type=a and press Enter.

    37. Type ns1.bluehost.com (or the primary name server displayed in your lab environment) and press Enter. This returns the IP address of the server, as shown in the following screenshot:


    38. The authoritative name server stores the records associated with the domain. So, if an attacker can determine the authoritative name server (primary name server) and obtain its associated IP address, he/she might attempt to exploit the server to perform attacks such as DoS, DDoS, and URL redirection.
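The interactive nslookup session produces free-form text, so a small parser is enough to pull out the answer and whether it was authoritative. A sketch in Python; the `sample` transcript below is a hypothetical reconstruction of the output seen in steps 26-30, not captured tool output:

```python
# Sketch: extract resolved name/address pairs from an nslookup
# transcript and flag whether the answer was non-authoritative.
def parse_nslookup(output):
    lines = [l.strip() for l in output.splitlines() if l.strip()]
    non_authoritative = any("Non-authoritative answer" in l for l in lines)
    records = {}
    name = None
    for line in lines:
        if line.startswith("Name:"):
            name = line.split(":", 1)[1].strip()
        elif line.startswith("Address:") and name:
            # Only pair an Address line with a preceding Name line,
            # so the resolver's own address at the top is skipped.
            records[name] = line.split(":", 1)[1].strip()
            name = None
    return non_authoritative, records

sample = """
Server:  google-public-dns-a.google.com
Address:  8.8.8.8

Non-authoritative answer:
Name:    www.certifiedhacker.com
Address:  69.89.31.193
"""
print(parse_nslookup(sample))
```

The same approach extends to the CNAME and NS lookups in steps 32-37 by matching their respective output lines.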





    Finding a Company’s Sub-domains using Sublist3r

    1. Log into the Kali Linux machine with root/toor.
    2. Launch a command line terminal by clicking the Terminal icon on the taskbar.

    3. Install Sublist3r: type apt update && apt -y install sublist3r and press Enter.
    Note: If Sublist3r is already installed, skip to Step #5.
    4. Sublist3r will start installing as shown in the screenshot. Wait until it completes the installation.
    5. Once the installation is completed, type sublist3r -h and press Enter. This command prints an overview of all options that are available to us with a description.
    6. Type sublist3r -d google.com -t 3 -e bing and press Enter. Here, -d specifies the domain whose subdomains to search for, -t 3 runs the search with 3 threads, and -e bing restricts the search to the Bing search engine.
    7. We have found the subdomains that are present in google.com.
    8. Now, search for subdomains of google.com with port 80 open. Type sublist3r -d google.com -p 80 -e bing and press Enter.
    9. Sublist3r will list all the subdomains of google.com with port 80 open, as shown in the screenshot.




    Gathering Personal Information using Online People Search Services

    Overview of Pipl
    Pipl aggregates vast quantities of public data and organizes the information into easy-to-follow profiles. Information like name, email address, phone number, street address, and username can be easily found using this tool.
    Lab Tasks
    1. Click the Windows icon at the lower-left corner of the screen.


    2. The Start menu appears. In the Apps list, scroll down to find Google Chrome.
    3. Click Google Chrome to launch the Chrome browser (or launch any other browser of your choice).
    4.   The Google Chrome browser window appears.
    5. In the browser, type https://pipl.com in the address bar and press Enter.
    6. The Pipl home page appears.
    7.   To begin the search, enter the details of the person you want to search for in the Name, Email, Username or Phone fields and click the Search icon.
    8.   Pipl returns search results with the name you have entered.
    9.    Click any of the links for more information on the person.
    10. Pipl displays the complete information as shown in the screenshot below.
    11. This will show career, education, usernames, phones, etc. information.
    12. To learn about the places the person has visited, click any link in the Places section.



    Gathering Information from LinkedIn using InSpy on Kali Linux

    1. Log into the Kali Linux machine with root/toor.
    2. Launch a command line terminal by clicking the Terminal icon on the taskbar.
    3. Install InSpy: type apt update && apt -y install inspy and press Enter.
    Note: If InSpy is already installed skip to Step #5.
    4. InSpy will start installing as shown in the screenshot; wait until it completes the installation.
    5. Once the installation is completed, type inspy -h and press Enter. This command prints an overview of all options that are available to us with a description.
    6. Type ls -ls /usr/share/inspy/wordlists/ and press Enter. This command will show that the wordlist directory contains 4 different wordlists, of which 2 contain job titles and are meant to be used in EmpSpy mode. The other 2 lists are meant to be used in TechSpy mode.
    7. Now that we have the wordlist files’ location, we can use them to search for Google employees along with their LinkedIn profiles.
    8. Now type inspy --empspy /usr/share/inspy/wordlists/title-list-large.txt google and press Enter.
    9. InSpy will list all the profile details of employees working at Google.




    Collecting Information About a Target Website using Firebug

    Overview of Firebug
    Firebug is an add-on tool for Mozilla Firefox. Running Firebug displays information like directory structure, internal URLs, cookies, session IDs, etc.
    Lab Tasks
    1. Log in to the Kali Linux machine with Username: root and Password: toor.
    2. Launch the Firefox browser from the taskbar as shown in the screenshot.
    3. The Firefox main window appears; type http://www.moviescope.com in the address bar and press Enter.
    4. Click the Firebug add-on on the top-right corner of the Navigation Toolbar to enable the Firebug control panel.

    5. The Firebug panel appears at the lower end of the screen, with the Console tab selected by default, as shown in the screenshot.


    6. Click the drop-down node of the Security tab under Console. Check the Warnings option.


    7.   Press F5 on the keyboard to refresh the webpage.
    8. The Security tab is under the Console section. Under this tab, Firebug displays all the issues related to the security of the website’s architecture, as shown in the following screenshot:


    9. The warning returned in the above screenshot states that password fields are present on an insecure (http://) page. This vulnerability allows attackers to easily sniff the passwords in plain text.
    Note: The warning results may vary depending on the websites you access.
    10. You can view the results in all the other tabs under the Console section, which might return useful information related to the website/web application.
    11. Click the Inspector tab in the Firebug UI. The Inspector section contains two tags: head and body, which contain scripts and text that might reveal the build of the website.
    Note: If you find this section empty, refresh the webpage.

    12. The head and body tags contain information related to the authentication of the username and password fields, such as the type of input that is to be given in the fields (numbers or characters, or combination of numbers and characters, etc.) which allows attackers to narrow down their exploitation techniques.
    For example, an attacker who knows that the password field takes only numbers can perform a brute force attack with only combinations of numbers (instead of applying random combinations of numbers, letters, and special characters).
    13. Expand these nodes and observe the script written to develop the webpage.
    14. Refer to tabs such as Rules, Computed, Animations and so on in the right pane in order to observe the script used to design the webpage.

    15. The Style Editor tab provides information on the CSS and the scripts (HTML and JavaScript) used to design the webpage.
    16. Attackers could use these scripts to build a similar website (cloned website) which could be used to serve malicious purposes such as harvesting the data entered in specific fields.
    17. Click the DOM (Document Object Model) tab in the Firebug control panel.
    18. This tab contains scripts written in various web technologies such as HTML5, jQuery, etc. This allows attackers to perform exploitation techniques on a specific version of a web application, which leads to exposure of sensitive information.
    19. Now, click the Network tab in the Firebug control panel.
    20. By default, the All tab under this section is selected.
    21. This tab displays the GET requests and responses for all the items in the Net section such as HTML, CSS, etc., along with their size, status, timeline, domain and remote IP.



    22. Under this tab, click a GET request related to moviescope.
    23. Under the Headers tab, expand the Response headers node.
    24. Observe the server name (IIS) and its version, along with the web application framework (ASP.NET) used to develop the website and its version. By learning this, attackers can target the vulnerabilities of that specific version in an attempt to exploit the web application.
    25. Attackers can use sniffing techniques to steal the cookies and manipulate them, thereby hijacking the session of an authenticated user without the need of entering legitimate credentials.
    26. By gaining the information described in the lab, an attacker can obtain the script related to a web page, identify the server-side technologies, and manipulate the cookies, which allows them to perform fraudulent activities such as entering the web application, cloning a web page, hijacking a session, stealing database information, etc.
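The header inspection in step 24 is easy to script once the response headers are in hand. A minimal Python sketch; the `sample_headers` values are hypothetical examples of what an IIS/ASP.NET site might return, not captured from moviescope.com:

```python
# Sketch: fingerprint the server stack from HTTP response headers,
# as done manually in the Firebug Network tab (step 24).
def fingerprint(headers):
    # Normalize header names to lower case for case-insensitive lookup.
    h = {k.lower(): v for k, v in headers.items()}
    return {
        "server": h.get("server", "unknown"),
        "framework": h.get("x-powered-by",
                           h.get("x-aspnet-version", "unknown")),
    }

# Hypothetical response headers from an IIS/ASP.NET deployment.
sample_headers = {
    "Server": "Microsoft-IIS/8.5",
    "X-Powered-By": "ASP.NET",
    "Content-Type": "text/html; charset=utf-8",
}
print(fingerprint(sample_headers))
```

Knowing the server product and framework versions is what lets an attacker look up version-specific vulnerabilities, which is why hardened deployments often strip or falsify these headers.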





    Extracting a Company’s Data using Web Data Extractor

    Overview of Web Data Extraction
    Web Data Extraction is the process of extracting data from web pages. It is also referred to as Web Scraping or Web Data Mining.
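The core mechanism behind such a tool can be sketched in a few lines: fetch page text, then pull out items of interest with pattern matching. A minimal Python illustration (the regexes are simplified for clarity, not production-grade patterns, and the sample HTML fragment is made up):

```python
# Sketch: the heart of a web data extractor - pull email addresses
# and URLs out of fetched page text with regular expressions.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
URL_RE = re.compile(r"https?://[^\s\"'<>]+")

def extract(page_text):
    return {
        "emails": sorted(set(EMAIL_RE.findall(page_text))),
        "urls": sorted(set(URL_RE.findall(page_text))),
    }

# Made-up page fragment standing in for a downloaded web page.
sample = '<a href="http://example.com/contact">Mail us</a> at info@example.com'
print(extract(sample))
```

A full extractor like the one in this lab adds crawling (following the extracted URLs up to a depth limit) and more categories such as phone and fax numbers.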
    Lab Tasks
    1. Navigate to \Web Spiders\Web Data Extractor and double-click wde.exe.
    2. If the Open File - Security Warning pop-up appears, click Run.
    3. Follow the wizard steps to install Web Data Extractor.
    4. On installation, launch Web Data Extractor from the Desktop.


    5. Web Data Extractor’s main window appears. Click New to start a new session.
    FIGURE 6.3: The Web Data Extractor main window
    6.   Clicking New opens the Session settings window.
    7. Type a URL (http://www.certifiedhacker.com) in the Starting URL field. Check all the options as shown in the following screenshot, and click OK.
    8. Click Start to initiate the Data Extraction.


    9. Web Data Extractor will start collecting information (emails, phones, faxes, etc.).


    10.  Once the data extraction process is completed, an Information dialog box appears. Click OK.
    11.    View the extracted information by clicking the tabs.
    FIGURE 6.8: Web Data Extractor Data Extraction windows
    12. Select Meta tags tab to view the URL, title, keywords, description, host, domain, etc.


    13. Select the Emails tab to view the email address, name, URL, title, host, keyword density, etc. related to the emails.
    14. Select the Phones tab to view the phone number, source, tag, etc.
    15. Check for more information under the Faxes, Merged list, URLs, and Inactive sites tabs.
    16. Specify the session name in the Save session dialog box and click OK.
    17. An Information pop-up may appear with the message You cannot save more than 10 records in Demo Version. Click OK.
    18. Select the Location and File format and click Save.
    19. By default, the session will be saved at C:\Program Files (x86)\WebExtractor\Data\certifiedhacker.com.
    20. You can save information from the Emails, Phones, Faxes, Merged list, URLs, and Inactive sites tabs.




    Mirroring Website using HTTrack Web Site Copier

    Overview of Web Site Mirroring
    Web site mirroring creates a replica of an existing site. It allows you to download a website to a local directory, analyze all directories, HTML, images, flash, videos and other files from the server on your computer.
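At its core, mirroring maps each remote URL onto a file under a local root that reproduces the site's directory layout. A simplified Python sketch of that mapping (the `mirror_path` rules and the `mirror` root directory are illustrative assumptions, not HTTrack's actual on-disk layout):

```python
# Sketch: the URL-to-local-path mapping a site mirror performs.
# Each remote URL is stored under a local root that reproduces the
# site's host name and directory structure.
from urllib.parse import urlparse
import posixpath

def mirror_path(url, local_root):
    parsed = urlparse(url)
    path = parsed.path or "/"
    if path.endswith("/"):
        path += "index.html"     # directory URLs map to index.html
    return posixpath.join(local_root, parsed.netloc, path.lstrip("/"))

print(mirror_path("http://www.certifiedhacker.com/", "mirror"))
print(mirror_path("http://www.certifiedhacker.com/images/logo.gif", "mirror"))
```

A real mirroring tool combines this mapping with a crawler (fetch a page, save it at its mapped path, queue every linked URL) and rewrites links in the saved HTML so the local copy browses like the live site.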
    Lab Tasks

    2.    If the Open File - Security Warning pop-up appears, click Run.
    3.    Follow the wizard steps to install HTTrack Web Site Copier.
    4. In the last step of the installation wizard, uncheck the View history.txt file option and click Finish.
    5. The WinHTTrack Website Copier main window appears. Click OK and then click Next to create a New Project.
    Note:  If  the  application  doesn’t  launch,  you  can  launch  it  manually  from  the Apps screen.


    6. Enter the name of the project in the New project name field. Select the Base path to store the copied files. Click Next.

    7. Enter www.certifiedhacker.com in the Web Addresses: (URL) field and click Set options.
    8. Click the Set options button to launch the WinHTTrack window.
    9. Click the Scan Rules tab and select the check boxes for the file types as shown
    10. Click Next.

    11. By default, the radio button is selected for Please adjust connection parameters if necessary, then press FINISH to launch the mirroring operation; check Disconnect when finished.
    12. Click Finish to start mirroring the website.
    13. Site mirroring progress will be displayed as in the following screenshot:
    14. WinHTTrack displays the message Mirroring operation complete once the site mirroring is completed. Click Browse Mirrored Website.
    15. The mirrored website for www.certifiedhacker.com launches. The URL displayed in the address bar indicates that the website’s image is stored on the local machine.
    Note:  If  the  webpage  does  not  open,  navigate  to  the  directory  where  you mirrored the website and open index.html with any browser.
    16.  Some websites are very large and it might take a long time to mirror the complete site.
    17. If you wish to stop the mirroring in progress, click Cancel on the Site mirroring progress window.
    18. The site will work like a live hosted website.