The Time Eater

SCR1P7K1DD13
6 min read · Feb 23, 2021


Automating a manual workflow is a rewarding task: it cuts down the time spent on jobs that would otherwise take very long to complete. In penetration testing we also have a wide variety of tools serving different purposes. Take recon, for example. When you perform recon on a target with a large scope, the first step is identifying its digital assets, including subdomains, IPs, and so on. Without the right tools for this, we would have to rely on search engine dorks and browse through all the results manually, which could take ages.

We use tools to simplify these tasks and get them done in less time. Tools like subfinder or sublister come in handy during subdomain enumeration, producing only the results we actually want to see.

But the question is: are we using them efficiently?

Suppose I want to enumerate the subdomains of target.com and take a snapshot of every identified subdomain. Rather than relying on a single tool, I tend to use multiple tools for the same purpose, because different tools employ different methods and so their outputs differ. For finding subdomains I might use sublister, subfinder, and Amass.

sublister -d target.com
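The other tools can be invoked in much the same way; for example (exact flags may differ between versions, and the output file names here are arbitrary):

subfinder -d target.com -o subfinder.txt

amass enum -d target.com -o amass.txt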

Then I merge the outputs from the tools into a single file and sort it for uniqueness. After this process, we are left with a good list of all the possible subdomains of target.com. The next step is taking screenshots, for which I use gowitness. But there is a problem: we can't feed the output of the enumeration tools directly to gowitness, since it requires a fully qualified URL as input. So we need to add a leading https:// to every line in the list, using match and replace or any other tool.
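One quick way to do that match and replace, assuming GNU sed and that the merged list lives in output.txt, is a one-liner that edits the file in place:

sed -i 's|^|https://|' output.txt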

gowitness file -s output.txt

This generates the output I need, but it consumes more time than it should. Running each tool from its respective directory, finding subdomains, merging the results into a single file, sorting it, adding the leading string, and finally taking the screenshots might take 5 to 10 minutes. Automated tools are meant to reduce the time a task takes, and now we are spending our time running the tools themselves.

How to solve this?

Well, we can write a tool that runs other tools. Let's see how the above scenario could be handled by running a single tool. First, let's define how the tool should work:

  1. It should take the domain name as input.
  2. It should enumerate all the possible subdomains of the supplied domain and save the output.
  3. The output should be consolidated and sorted.
  4. The tool should take screenshots of the identified subdomains.
  5. All these outputs should be saved into specific folders generated at runtime.

Now that we have an idea of how the tool should work, let's see how we can make it happen. The tool will be developed using shell scripting.

Subscanner

Developing tools with shell scripting is easy, since a script can issue system commands directly when it runs.

Step#1:: Laying out the basics.

In this step, we will identify the basic commands to be executed. We will use sublister and subfinder for subdomain enumeration, httpx for checking which websites are live, the sort command for sorting the output file, and finally gowitness for taking screenshots. Without further ado, let's start writing the code.

Subdomain Enumeration

#!/bin/sh

sublister -d target.com -t 50 -o output.txt & subfinder -d target.com >> output.txt

In this step, we instruct the interpreter to execute both commands at the same time: the '&' lets sublister and subfinder run concurrently. Sublister runs with 50 threads and saves its output to output.txt. Since sublister is already writing to output.txt, we can't simply point subfinder at the same file in the usual way, because that would overwrite the data written by sublister. That is why we use '>>', which appends to the existing file instead.

output: *.target.com (* can be anything)

Sorting

#!/bin/sh

sublister -d target.com -t 50 -o output.txt & subfinder -d target.com >> output.txt

wait

sort -u -o output.txt output.txt

After producing output.txt, the next step is to sort it and delete any duplicates. Because sublister was started in the background with '&', we first call wait so the script pauses until it has finished writing. Then we use the sort command available on Linux. The -u flag tells it to keep only unique values, which effectively deletes duplicate entries, and the -o flag writes the result to a file. In this case we read from and write to the same file, which is safe because sort reads all of its input before writing the output.
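As a quick illustration with made-up entries, sort -u collapses the duplicate and writes the result back over the same file:

$ cat output.txt
www.target.com
mail.target.com
www.target.com
$ sort -u -o output.txt output.txt
$ cat output.txt
mail.target.com
www.target.com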

Adding leading string

#!/bin/sh

sublister -d target.com -t 50 -o output.txt & subfinder -d target.com >> output.txt

wait

sort -u -o output.txt output.txt

cat output.txt | httpx -o output.txt

httpx is a great tool by projectdiscovery that is basically used to check whether a host returns an HTTP response, i.e. whether it is actually running a web server. In this case we also use it to get the leading https:// into our output.txt, since the input to gowitness has to be a complete URL. To do that, we feed output.txt to httpx and save its output back into output.txt itself.

output: https://*.target.com

Taking Screenshots

#!/bin/sh

sublister -d target.com -t 50 -o output.txt & subfinder -d target.com >> output.txt

wait

sort -u -o output.txt output.txt

cat output.txt | httpx -o output.txt

gowitness -T 5 file -s output.txt -d /screenshot

gowitness is a tool written in Go that does the same job as EyeWitness, if you are familiar with that one. It takes URLs as input, issues requests to them, and takes screenshots based on the responses it gets. Here it reads the lines from output.txt and saves the captured snaps to the /screenshot folder. The -T 5 flag tells the tool to time out after waiting 5 seconds.

Step#2:: Reading inputs

Perfect. Now that we have the idea and the commands to be executed, let's see how to take command line arguments as input to our tool.

No matter what the tool is named, it should be possible to run it with a basic call that looks like this:

subscan.sh target.com

That is easy too. In the shell, if you pass an argument to a script, as in the scenario above, then target.com occupies the first position among the passed arguments, and you can access it simply by referencing $1. If you pass multiple values as arguments, each occupies the next consecutive position; for example, if the command looked like the following,

subscan.sh target1.com target2.com

Then $1 = target1.com, $2 = target2.com, and so on. Simple, right?
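A tiny throwaway script, call it args.sh purely for illustration, makes this easy to see:

#!/bin/sh
# print the first two positional arguments and the argument count
echo "first argument: $1"
echo "second argument: $2"
echo "number of arguments: $#"

Running ./args.sh target1.com target2.com prints both domains and a count of 2; that count, $#, is exactly what we check in the next step.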

This tool will take a single argument as input, so the first thing we have to ensure is that the user has provided at least one argument. This can be achieved with the following code.

if [ "$#" -eq 0 ]; then
    echo "Missing argument"
    exit 1

Here we check whether any argument has been passed. If you haven't passed one, the script prints Missing argument and exits. The next step is to define the else branch, but before adding the code from Step#1 you should consider creating directories to save the output to. If you plan to use this tool in the long run and dump everything into a single directory, it will become messy and inconvenient after a while. To keep things tidy, we will write the files into $HOME/subscanner/$1.

Doesn't make sense yet? Let's break down what this directory structure means.

$HOME is the home directory of the current user, subscanner is the new directory we are creating, and $1, as you already know, is the argument value that was passed (target.com). So, in effect, if you run the command

subscan.sh target.com

then the output will be saved under /home/user/subscanner/target.com/output.txt. So far so good; let's incorporate this logic into our code.

Step#3:: The tool

#!/bin/sh

if [ "$#" -eq 0 ]; then
    echo "Missing argument"
    exit 1
else
    dir=$1

    # create the per-target output directories
    mkdir -p "$HOME/subscanner/$dir"
    mkdir -p "$HOME/subscanner/$dir/screenshot"

    # enumerate subdomains with both tools in parallel, then wait for the background job
    sublister -d "$1" -t 50 -o "$HOME/subscanner/$dir/output.txt" & subfinder -d "$1" >> "$HOME/subscanner/$dir/output.txt"
    wait

    sort -u -o "$HOME/subscanner/$dir/output.txt" "$HOME/subscanner/$dir/output.txt"

    # probe for live hosts and prepend the scheme
    cat "$HOME/subscanner/$dir/output.txt" | httpx -o "$HOME/subscanner/$dir/output.txt"

    # screenshot every live URL
    gowitness -T 5 file -s "$HOME/subscanner/$dir/output.txt" -d "$HOME/subscanner/$dir/screenshot"
fi

Once run, this tool scans a single target at a time and saves the list of unique subdomains under /home/user/subscanner/target.com, with the corresponding screenshots under /home/user/subscanner/target.com/screenshot. In the same way you could automate every tool on your system, and you could even run these scripts as cron jobs so they repeat at set intervals.
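Assuming the current user is named user (that name, like the script path below, is only an example), the layout after a run against target.com would look roughly like this:

/home/user/subscanner/target.com/
├── output.txt
└── screenshot/

And a crontab entry along these lines would repeat the scan every day at 2 AM:

0 2 * * * /home/user/tools/subscan.sh target.com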

Note: All the mentioned tools (sublister, httpx, …) should be globally accessible in order for the new tool to work. If they are not, try adding their locations to your PATH via .bashrc.
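For example, if your Go-based tools live in $HOME/go/bin and your own scripts in $HOME/tools (both paths are assumptions about your setup), appending a line like this to ~/.bashrc would do it:

export PATH="$PATH:$HOME/go/bin:$HOME/tools"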

I would be happier if you could create a tool of your own after reading this.

If you've enjoyed reading, leave a clap.
